2026-03-23T17:26:56.906 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-23T17:26:56.913 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-23T17:26:56.934 INFO:teuthology.run:Config: archive_path: /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501
branch: tentacle
description: rbd/cli/{base/install clusters/{fixed-1} conf/{disable-pool-app} data-pool/ec features/defaults msgr-failures/few objectstore/bluestore-comp-zstd supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}
email: null
first_in_suite: false
flavor: default
job_id: '3501'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps
no_nested_subset: false
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: tentacle
  ansible.cephlab:
    branch: main
    repo: https://github.com/kshtsk/ceph-cm-ansible.git
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      logical_volumes:
        lv_1:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_2:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_3:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_4:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
      timezone: UTC
      volume_groups:
        vg_nvme:
          pvs: /dev/vdb,/dev/vdc,/dev/vdd,/dev/vde
  ceph:
    conf:
      client:
        rbd default data pool: datapool
        rbd default features: 61
      global:
        mon client directed command retry: 5
        mon warn on pool no app: false
        ms inject socket failures: 5000
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        bluestore block size: 96636764160
        bluestore compression algorithm: zstd
        bluestore compression mode: aggressive
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        enable experimental unrecoverable data corrupting features: '*'
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd debug randomize hobject sort order: false
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd shutdown pgref assert: true
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - \(OSD_SLOW_PING_TIME
    sha1: 70f8415b300f041766fa27faf7d5472699e32388
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      global:
        osd crush chooseleaf type: 0
        osd pool default pg num: 128
        osd pool default pgp num: 128
        osd pool default size: 2
      mon: {}
  cephadm:
    cephadm_binary_url: https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm
  install:
    ceph:
      flavor: default
      sha1: 70f8415b300f041766fa27faf7d5472699e32388
    extra_system_packages:
      deb:
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-tentacle
    sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 3051
sha1: 70f8415b300f041766fa27faf7d5472699e32388
sleep_before_teardown: 0
subset: 1/128
suite: rbd
suite_branch: tt-tentacle
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
targets:
  vm04.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBJoXz0D/Nu8/hzq2ZZlacEgnDdWCoJRGl3xE/MAUqns0Wim/v+eJZAklj7iE3Nx/DuH33O/sKetP0pALV/8LkE=
tasks:
- install: null
- ceph: null
- exec:
    client.0:
    - sudo ceph osd erasure-code-profile set teuthologyprofile crush-failure-domain=osd m=1 k=2
    - sudo ceph osd pool create datapool 4 4 erasure teuthologyprofile
    - sudo ceph osd pool set datapool allow_ec_overwrites true
    - rbd pool init datapool
- workunit:
    clients:
      client.0:
      - rbd/cli_generic.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-20_22:04:26
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.2366871
2026-03-23T17:26:56.934 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa; will attempt to use it
2026-03-23T17:26:56.934 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa/tasks
2026-03-23T17:26:56.934 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-23T17:26:56.935 INFO:teuthology.task.internal:Checking packages...
2026-03-23T17:26:56.935 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash '70f8415b300f041766fa27faf7d5472699e32388'
2026-03-23T17:26:56.935 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-23T17:26:56.935 INFO:teuthology.packaging:ref: None
2026-03-23T17:26:56.935 INFO:teuthology.packaging:tag: None
2026-03-23T17:26:56.935 INFO:teuthology.packaging:branch: tentacle
2026-03-23T17:26:56.935 INFO:teuthology.packaging:sha1: 70f8415b300f041766fa27faf7d5472699e32388
2026-03-23T17:26:56.935 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=tentacle
2026-03-23T17:26:57.746 INFO:teuthology.task.internal:Found packages for ceph version 20.2.0-714-g147f7c6a-1jammy
2026-03-23T17:26:57.747 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-23T17:26:57.748 INFO:teuthology.task.internal:no buildpackages task found
2026-03-23T17:26:57.748 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-23T17:26:57.749 INFO:teuthology.task.internal:Saving configuration
2026-03-23T17:26:57.755 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-23T17:26:57.756 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-23T17:26:57.763 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm04.local', 'description': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-23 17:26:14.020361', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:04', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBJoXz0D/Nu8/hzq2ZZlacEgnDdWCoJRGl3xE/MAUqns0Wim/v+eJZAklj7iE3Nx/DuH33O/sKetP0pALV/8LkE='}
2026-03-23T17:26:57.763 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-23T17:26:57.764 INFO:teuthology.task.internal:roles: ubuntu@vm04.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-23T17:26:57.764 INFO:teuthology.run_tasks:Running task console_log...
2026-03-23T17:26:57.772 DEBUG:teuthology.task.console_log:vm04 does not support IPMI; excluding
2026-03-23T17:26:57.772 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7fc69f7a5000>, signals=[15])
2026-03-23T17:26:57.772 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-23T17:26:57.773 INFO:teuthology.task.internal:Opening connections...
2026-03-23T17:26:57.773 DEBUG:teuthology.task.internal:connecting to ubuntu@vm04.local
2026-03-23T17:26:57.774 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-23T17:26:57.835 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-23T17:26:57.836 DEBUG:teuthology.orchestra.run.vm04:> uname -m
2026-03-23T17:26:57.965 INFO:teuthology.orchestra.run.vm04.stdout:x86_64
2026-03-23T17:26:57.965 DEBUG:teuthology.orchestra.run.vm04:> cat /etc/os-release
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:NAME="Ubuntu"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:VERSION_ID="22.04"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:VERSION_CODENAME=jammy
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:ID=ubuntu
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:ID_LIKE=debian
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-23T17:26:58.010 INFO:teuthology.orchestra.run.vm04.stdout:UBUNTU_CODENAME=jammy
2026-03-23T17:26:58.010 INFO:teuthology.lock.ops:Updating vm04.local on lock server
2026-03-23T17:26:58.014 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-23T17:26:58.016 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-23T17:26:58.017 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-23T17:26:58.017 DEBUG:teuthology.orchestra.run.vm04:> test '!' -e /home/ubuntu/cephtest
2026-03-23T17:26:58.053 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-23T17:26:58.054 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-23T17:26:58.054 DEBUG:teuthology.orchestra.run.vm04:> test -z $(ls -A /var/lib/ceph)
2026-03-23T17:26:58.098 INFO:teuthology.orchestra.run.vm04.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-23T17:26:58.098 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-23T17:26:58.105 DEBUG:teuthology.orchestra.run.vm04:> test -e /ceph-qa-ready
2026-03-23T17:26:58.141 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T17:26:58.418 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-23T17:26:58.419 INFO:teuthology.task.internal:Creating test directory...
2026-03-23T17:26:58.419 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-23T17:26:58.423 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-23T17:26:58.424 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-23T17:26:58.425 INFO:teuthology.task.internal:Creating archive directory...
2026-03-23T17:26:58.425 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-23T17:26:58.472 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-23T17:26:58.473 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-23T17:26:58.473 DEBUG:teuthology.orchestra.run.vm04:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-23T17:26:58.514 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T17:26:58.514 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-23T17:26:58.564 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-23T17:26:58.568 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-23T17:26:58.569 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-23T17:26:58.571 INFO:teuthology.task.internal:Configuring sudo...
2026-03-23T17:26:58.571 DEBUG:teuthology.orchestra.run.vm04:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-23T17:26:58.618 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-23T17:26:58.620 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-23T17:26:58.620 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-23T17:26:58.662 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-23T17:26:58.706 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-23T17:26:58.750 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:26:58.750 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-23T17:26:58.800 DEBUG:teuthology.orchestra.run.vm04:> sudo service rsyslog restart
2026-03-23T17:26:58.859 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-23T17:26:58.860 INFO:teuthology.task.internal:Starting timer...
2026-03-23T17:26:58.861 INFO:teuthology.run_tasks:Running task pcp...
2026-03-23T17:26:58.863 INFO:teuthology.run_tasks:Running task selinux...
2026-03-23T17:26:58.865 INFO:teuthology.task.selinux:Excluding vm04: VMs are not yet supported
2026-03-23T17:26:58.865 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-23T17:26:58.865 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-23T17:26:58.865 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-23T17:26:58.865 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-23T17:26:58.867 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}
2026-03-23T17:26:58.867 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main to origin/main
2026-03-23T17:26:58.873 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-23T17:26:58.873 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "logical_volumes": {"lv_1": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_2": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_3": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_4": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}}, "timezone": "UTC", "volume_groups": {"vg_nvme": {"pvs": "/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde"}}}' -i /tmp/teuth_ansible_inventory4gdbqz1w --limit vm04.local /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-23T17:28:46.850 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm04.local')]
2026-03-23T17:28:46.850 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm04.local'
2026-03-23T17:28:46.851 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-23T17:28:46.909 DEBUG:teuthology.orchestra.run.vm04:> true
2026-03-23T17:28:47.121 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm04.local'
2026-03-23T17:28:47.121 INFO:teuthology.run_tasks:Running task clock...
2026-03-23T17:28:47.123 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-23T17:28:47.123 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-23T17:28:47.123 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Command line: ntpd -gq
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: ----------------------------------------------------
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: ntp-4 is maintained by Network Time Foundation,
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: corporation. Support and training for ntp-4 are
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: available at https://www.nwtime.org/support
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: ----------------------------------------------------
2026-03-23T17:28:47.179 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: proto: precision = 0.029 usec (-25)
2026-03-23T17:28:47.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: basedate set to 2022-02-04
2026-03-23T17:28:47.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: gps base set to 2022-02-06 (week 2196)
2026-03-23T17:28:47.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-23T17:28:47.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-23T17:28:47.180 INFO:teuthology.orchestra.run.vm04.stderr:23 Mar 17:28:47 ntpd[16219]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 86 days ago
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listen and drop on 0 v6wildcard [::]:123
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listen normally on 2 lo 127.0.0.1:123
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listen normally on 3 ens3 192.168.123.104:123
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listen normally on 4 lo [::1]:123
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:4%2]:123
2026-03-23T17:28:47.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:47 ntpd[16219]: Listening on routing socket on fd #22 for interface updates
2026-03-23T17:28:48.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:48 ntpd[16219]: Soliciting pool server 130.61.133.198
2026-03-23T17:28:49.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:49 ntpd[16219]: Soliciting pool server 213.172.105.106
2026-03-23T17:28:49.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:49 ntpd[16219]: Soliciting pool server 5.9.193.27
2026-03-23T17:28:50.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:50 ntpd[16219]: Soliciting pool server 129.70.132.32
2026-03-23T17:28:50.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:50 ntpd[16219]: Soliciting pool server 185.252.140.125
2026-03-23T17:28:50.181 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:50 ntpd[16219]: Soliciting pool server 157.230.22.48
2026-03-23T17:28:51.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:51 ntpd[16219]: Soliciting pool server 129.70.132.35
2026-03-23T17:28:51.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:51 ntpd[16219]: Soliciting pool server 131.188.3.222
2026-03-23T17:28:51.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:51 ntpd[16219]: Soliciting pool server 172.104.154.182
2026-03-23T17:28:51.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:51 ntpd[16219]: Soliciting pool server 82.165.178.31
2026-03-23T17:28:52.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:52 ntpd[16219]: Soliciting pool server 136.243.177.133
2026-03-23T17:28:52.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:52 ntpd[16219]: Soliciting pool server 162.159.200.1
2026-03-23T17:28:52.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:52 ntpd[16219]: Soliciting pool server 85.220.190.246
2026-03-23T17:28:52.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:52 ntpd[16219]: Soliciting pool server 185.125.190.57
2026-03-23T17:28:53.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:53 ntpd[16219]: Soliciting pool server 185.125.190.56
2026-03-23T17:28:53.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:53 ntpd[16219]: Soliciting pool server 217.115.11.162
2026-03-23T17:28:53.180 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:53 ntpd[16219]: Soliciting pool server 79.133.44.141
2026-03-23T17:28:55.269 INFO:teuthology.orchestra.run.vm04.stdout:23 Mar 17:28:55 ntpd[16219]: ntpd: time slew -0.008462 s
2026-03-23T17:28:55.269 INFO:teuthology.orchestra.run.vm04.stdout:ntpd: time slew -0.008462s
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout:     remote           refid      st t when poll reach   delay   offset   jitter
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout:==============================================================================
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-23T17:28:55.289 INFO:teuthology.orchestra.run.vm04.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-23T17:28:55.289 INFO:teuthology.run_tasks:Running task install...
2026-03-23T17:28:55.291 DEBUG:teuthology.task.install:project ceph
2026-03-23T17:28:55.291 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-23T17:28:55.291 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-23T17:28:55.291 INFO:teuthology.task.install:Using flavor: default
2026-03-23T17:28:55.293 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-23T17:28:55.293 INFO:teuthology.task.install:extra packages: []
2026-03-23T17:28:55.294 DEBUG:teuthology.orchestra.run.vm04:> sudo apt-key list | grep Ceph
2026-03-23T17:28:55.370 INFO:teuthology.orchestra.run.vm04.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-23T17:28:55.390 INFO:teuthology.orchestra.run.vm04.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-23T17:28:55.390 INFO:teuthology.orchestra.run.vm04.stdout:uid [ unknown] Ceph.com (release key)
2026-03-23T17:28:55.390 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-23T17:28:55.390 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-jmespath, python3-xmltodict, s3cmd on remote deb x86_64
2026-03-23T17:28:55.390 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-23T17:28:56.006 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default/
2026-03-23T17:28:56.006 INFO:teuthology.task.install.deb:Package version is 20.2.0-712-g70f8415b-1jammy
2026-03-23T17:28:56.462 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:28:56.462 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-23T17:28:56.471 DEBUG:teuthology.orchestra.run.vm04:> sudo apt-get update
2026-03-23T17:28:56.591 INFO:teuthology.orchestra.run.vm04.stdout:Hit:1 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-23T17:28:56.594 INFO:teuthology.orchestra.run.vm04.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-23T17:28:56.601 INFO:teuthology.orchestra.run.vm04.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-23T17:28:56.645 INFO:teuthology.orchestra.run.vm04.stdout:Hit:4 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-23T17:28:57.146 INFO:teuthology.orchestra.run.vm04.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy InRelease
2026-03-23T17:28:57.883 INFO:teuthology.orchestra.run.vm04.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy Release [7680 B]
2026-03-23T17:28:58.024 INFO:teuthology.orchestra.run.vm04.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-23T17:28:58.180 INFO:teuthology.orchestra.run.vm04.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.8 kB]
2026-03-23T17:28:58.261 INFO:teuthology.orchestra.run.vm04.stdout:Fetched 26.5 kB in 2s (16.3 kB/s)
2026-03-23T17:28:58.939 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T17:28:58.953 DEBUG:teuthology.orchestra.run.vm04:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=20.2.0-712-g70f8415b-1jammy cephadm=20.2.0-712-g70f8415b-1jammy ceph-mds=20.2.0-712-g70f8415b-1jammy ceph-mgr=20.2.0-712-g70f8415b-1jammy ceph-common=20.2.0-712-g70f8415b-1jammy ceph-fuse=20.2.0-712-g70f8415b-1jammy ceph-test=20.2.0-712-g70f8415b-1jammy ceph-volume=20.2.0-712-g70f8415b-1jammy radosgw=20.2.0-712-g70f8415b-1jammy python3-rados=20.2.0-712-g70f8415b-1jammy python3-rgw=20.2.0-712-g70f8415b-1jammy python3-cephfs=20.2.0-712-g70f8415b-1jammy python3-rbd=20.2.0-712-g70f8415b-1jammy libcephfs2=20.2.0-712-g70f8415b-1jammy libcephfs-dev=20.2.0-712-g70f8415b-1jammy librados2=20.2.0-712-g70f8415b-1jammy librbd1=20.2.0-712-g70f8415b-1jammy rbd-fuse=20.2.0-712-g70f8415b-1jammy
2026-03-23T17:28:58.987 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T17:28:59.178 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T17:28:59.178 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T17:28:59.323 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T17:28:59.323 INFO:teuthology.orchestra.run.vm04.stdout:  kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-23T17:28:59.323 INFO:teuthology.orchestra.run.vm04.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-23T17:28:59.323 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:The following additional packages will be installed:
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  libcephfs-daemon libcephfs-proxy2 libdouble-conversion3 libfuse2 libjq1
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  liblttng-ust1 libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 librgw2
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  libsqlite3-mod-ceph libthrift-0.16.0 nvme-cli python-asyncssh-doc
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-asyncssh python3-cachetools python3-ceph-argparse
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-iniconfig python3-jaraco.classes python3-jaraco.collections
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-pluggy python3-portend
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-pytest python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora python3-threadpoolctl python3-toml python3-wcwidth
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob python3-websocket python3-zc.lockfile qttranslations5-l10n
2026-03-23T17:28:59.324 INFO:teuthology.orchestra.run.vm04.stdout: smartmontools socat xmlstarlet
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout:Suggested packages:
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout: python3-influxdb liblua5.3-dev luarocks python-natsort-doc python-psutil-doc
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout: subversion python-pygments-doc ttf-bitstream-vera python3-paste python3-dap
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout: python-sklearn-doc ipython3 python-webob-doc gsmartcontrol smart-notifier
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout: mailx | mailutils
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout:Recommended packages:
2026-03-23T17:28:59.325 INFO:teuthology.orchestra.run.vm04.stdout: btrfs-tools
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout:The following NEW packages will be installed:
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-daemon libcephfs-dev libcephfs-proxy2 libcephfs2
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 libnbd0 liboath0
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 librdkafka1 librgw2 libsqlite3-mod-ceph libthrift-0.16.0
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-natsort
2026-03-23T17:28:59.367 INFO:teuthology.orchestra.run.vm04.stdout: python3-pluggy python3-portend python3-prettytable python3-psutil python3-py
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: python3-pygments python3-pytest python3-rados python3-rbd python3-repoze.lru
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib python3-rgw python3-routes python3-rsa
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: python3-threadpoolctl python3-toml python3-wcwidth python3-webob
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: smartmontools socat xmlstarlet
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be upgraded:
2026-03-23T17:28:59.368 INFO:teuthology.orchestra.run.vm04.stdout: librados2 librbd1
2026-03-23T17:28:59.455 INFO:teuthology.orchestra.run.vm04.stdout:2 upgraded, 85 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T17:28:59.455 INFO:teuthology.orchestra.run.vm04.stdout:Need to get 281 MB of archives.
2026-03-23T17:28:59.455 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 1092 MB of additional disk space will be used.
2026-03-23T17:28:59.455 INFO:teuthology.orchestra.run.vm04.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB]
2026-03-23T17:28:59.634 INFO:teuthology.orchestra.run.vm04.stdout:Get:2 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB]
2026-03-23T17:28:59.639 INFO:teuthology.orchestra.run.vm04.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB]
2026-03-23T17:28:59.674 INFO:teuthology.orchestra.run.vm04.stdout:Get:4 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB]
2026-03-23T17:28:59.775 INFO:teuthology.orchestra.run.vm04.stdout:Get:5 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB]
2026-03-23T17:28:59.780 INFO:teuthology.orchestra.run.vm04.stdout:Get:6 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB]
2026-03-23T17:28:59.793 INFO:teuthology.orchestra.run.vm04.stdout:Get:7 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB]
2026-03-23T17:28:59.797 INFO:teuthology.orchestra.run.vm04.stdout:Get:8 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB]
2026-03-23T17:28:59.798 INFO:teuthology.orchestra.run.vm04.stdout:Get:9 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB]
2026-03-23T17:28:59.798 INFO:teuthology.orchestra.run.vm04.stdout:Get:10 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB]
2026-03-23T17:28:59.799 INFO:teuthology.orchestra.run.vm04.stdout:Get:11 http://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB]
2026-03-23T17:28:59.807 INFO:teuthology.orchestra.run.vm04.stdout:Get:12 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB]
2026-03-23T17:28:59.808 INFO:teuthology.orchestra.run.vm04.stdout:Get:13 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B]
2026-03-23T17:28:59.811 INFO:teuthology.orchestra.run.vm04.stdout:Get:14 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB]
2026-03-23T17:28:59.844 INFO:teuthology.orchestra.run.vm04.stdout:Get:15 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B]
2026-03-23T17:28:59.844 INFO:teuthology.orchestra.run.vm04.stdout:Get:16 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B]
2026-03-23T17:28:59.844 INFO:teuthology.orchestra.run.vm04.stdout:Get:17 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB]
2026-03-23T17:28:59.844 INFO:teuthology.orchestra.run.vm04.stdout:Get:18 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB]
2026-03-23T17:28:59.844 INFO:teuthology.orchestra.run.vm04.stdout:Get:19 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B]
2026-03-23T17:28:59.844 INFO:teuthology.orchestra.run.vm04.stdout:Get:20 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B]
2026-03-23T17:28:59.845 INFO:teuthology.orchestra.run.vm04.stdout:Get:21 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB]
2026-03-23T17:28:59.846 INFO:teuthology.orchestra.run.vm04.stdout:Get:22 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB]
2026-03-23T17:28:59.846 INFO:teuthology.orchestra.run.vm04.stdout:Get:23 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB]
2026-03-23T17:28:59.882 INFO:teuthology.orchestra.run.vm04.stdout:Get:24 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB]
2026-03-23T17:28:59.884 INFO:teuthology.orchestra.run.vm04.stdout:Get:25 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB]
2026-03-23T17:28:59.884 INFO:teuthology.orchestra.run.vm04.stdout:Get:26 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB]
2026-03-23T17:28:59.884 INFO:teuthology.orchestra.run.vm04.stdout:Get:27 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB]
2026-03-23T17:28:59.956 INFO:teuthology.orchestra.run.vm04.stdout:Get:28 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB]
2026-03-23T17:28:59.957 INFO:teuthology.orchestra.run.vm04.stdout:Get:29 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB]
2026-03-23T17:28:59.957 INFO:teuthology.orchestra.run.vm04.stdout:Get:30 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB]
2026-03-23T17:28:59.964 INFO:teuthology.orchestra.run.vm04.stdout:Get:31 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B]
2026-03-23T17:28:59.964 INFO:teuthology.orchestra.run.vm04.stdout:Get:32 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB]
2026-03-23T17:28:59.964 INFO:teuthology.orchestra.run.vm04.stdout:Get:33 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB]
2026-03-23T17:28:59.965 INFO:teuthology.orchestra.run.vm04.stdout:Get:34 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB]
2026-03-23T17:28:59.965 INFO:teuthology.orchestra.run.vm04.stdout:Get:35 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB]
2026-03-23T17:28:59.965 INFO:teuthology.orchestra.run.vm04.stdout:Get:36 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB]
2026-03-23T17:28:59.993 INFO:teuthology.orchestra.run.vm04.stdout:Get:37 http://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB]
2026-03-23T17:29:00.002 INFO:teuthology.orchestra.run.vm04.stdout:Get:38 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB]
2026-03-23T17:29:00.003 INFO:teuthology.orchestra.run.vm04.stdout:Get:39 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB]
2026-03-23T17:29:00.004 INFO:teuthology.orchestra.run.vm04.stdout:Get:40 http://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB]
2026-03-23T17:29:00.007 INFO:teuthology.orchestra.run.vm04.stdout:Get:41 http://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB]
2026-03-23T17:29:00.009 INFO:teuthology.orchestra.run.vm04.stdout:Get:42 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB]
2026-03-23T17:29:00.032 INFO:teuthology.orchestra.run.vm04.stdout:Get:43 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB]
2026-03-23T17:29:00.034 INFO:teuthology.orchestra.run.vm04.stdout:Get:44 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B]
2026-03-23T17:29:00.034 INFO:teuthology.orchestra.run.vm04.stdout:Get:45 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB]
2026-03-23T17:29:00.034 INFO:teuthology.orchestra.run.vm04.stdout:Get:46 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB]
2026-03-23T17:29:00.064 INFO:teuthology.orchestra.run.vm04.stdout:Get:47 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB]
2026-03-23T17:29:00.064 INFO:teuthology.orchestra.run.vm04.stdout:Get:48 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB]
2026-03-23T17:29:00.071 INFO:teuthology.orchestra.run.vm04.stdout:Get:49 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB]
2026-03-23T17:29:00.071 INFO:teuthology.orchestra.run.vm04.stdout:Get:50 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB]
2026-03-23T17:29:00.072 INFO:teuthology.orchestra.run.vm04.stdout:Get:51 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB]
2026-03-23T17:29:00.099 INFO:teuthology.orchestra.run.vm04.stdout:Get:52 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB]
2026-03-23T17:29:00.100 INFO:teuthology.orchestra.run.vm04.stdout:Get:53 http://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB]
2026-03-23T17:29:00.162 INFO:teuthology.orchestra.run.vm04.stdout:Get:54 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 20.2.0-712-g70f8415b-1jammy [2867 kB]
2026-03-23T17:29:00.167 INFO:teuthology.orchestra.run.vm04.stdout:Get:55 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB]
2026-03-23T17:29:04.616 INFO:teuthology.orchestra.run.vm04.stdout:Get:56 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 20.2.0-712-g70f8415b-1jammy [3583 kB]
2026-03-23T17:29:07.478 INFO:teuthology.orchestra.run.vm04.stdout:Get:57 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 20.2.0-712-g70f8415b-1jammy [829 kB]
2026-03-23T17:29:07.846 INFO:teuthology.orchestra.run.vm04.stdout:Get:58 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 20.2.0-712-g70f8415b-1jammy [364 kB]
2026-03-23T17:29:07.969 INFO:teuthology.orchestra.run.vm04.stdout:Get:59 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 20.2.0-712-g70f8415b-1jammy [32.8 kB]
2026-03-23T17:29:07.969 INFO:teuthology.orchestra.run.vm04.stdout:Get:60 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 20.2.0-712-g70f8415b-1jammy [184 kB]
2026-03-23T17:29:08.091 INFO:teuthology.orchestra.run.vm04.stdout:Get:61 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 20.2.0-712-g70f8415b-1jammy [83.8 kB]
2026-03-23T17:29:08.092 INFO:teuthology.orchestra.run.vm04.stdout:Get:62 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 20.2.0-712-g70f8415b-1jammy [341 kB]
2026-03-23T17:29:08.215 INFO:teuthology.orchestra.run.vm04.stdout:Get:63 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 20.2.0-712-g70f8415b-1jammy [8697 kB]
2026-03-23T17:29:10.673 INFO:teuthology.orchestra.run.vm04.stdout:Get:64 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 20.2.0-712-g70f8415b-1jammy [112 kB]
2026-03-23T17:29:10.674 INFO:teuthology.orchestra.run.vm04.stdout:Get:65 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 20.2.0-712-g70f8415b-1jammy [261 kB]
2026-03-23T17:29:10.793 INFO:teuthology.orchestra.run.vm04.stdout:Get:66 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 20.2.0-712-g70f8415b-1jammy [29.3 MB]
2026-03-23T17:29:14.480 INFO:teuthology.orchestra.run.vm04.stdout:Get:67 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 20.2.0-712-g70f8415b-1jammy [5415 kB]
2026-03-23T17:29:14.861 INFO:teuthology.orchestra.run.vm04.stdout:Get:68 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 20.2.0-712-g70f8415b-1jammy [246 kB]
2026-03-23T17:29:14.862 INFO:teuthology.orchestra.run.vm04.stdout:Get:69 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 20.2.0-712-g70f8415b-1jammy [124 kB]
2026-03-23T17:29:14.965 INFO:teuthology.orchestra.run.vm04.stdout:Get:70 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 20.2.0-712-g70f8415b-1jammy [906 kB]
2026-03-23T17:29:14.979 INFO:teuthology.orchestra.run.vm04.stdout:Get:71 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 20.2.0-712-g70f8415b-1jammy [6399 kB]
2026-03-23T17:29:15.466 INFO:teuthology.orchestra.run.vm04.stdout:Get:72 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 20.2.0-712-g70f8415b-1jammy [21.7 MB]
2026-03-23T17:29:17.014 INFO:teuthology.orchestra.run.vm04.stdout:Get:73 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 20.2.0-712-g70f8415b-1jammy [14.1 kB]
2026-03-23T17:29:17.014 INFO:teuthology.orchestra.run.vm04.stdout:Get:74 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 20.2.0-712-g70f8415b-1jammy [955 kB]
2026-03-23T17:29:17.017 INFO:teuthology.orchestra.run.vm04.stdout:Get:75 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 20.2.0-712-g70f8415b-1jammy [2341 kB]
2026-03-23T17:29:17.182 INFO:teuthology.orchestra.run.vm04.stdout:Get:76 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 20.2.0-712-g70f8415b-1jammy [1049 kB]
2026-03-23T17:29:17.300 INFO:teuthology.orchestra.run.vm04.stdout:Get:77 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 20.2.0-712-g70f8415b-1jammy [179 kB]
2026-03-23T17:29:17.301 INFO:teuthology.orchestra.run.vm04.stdout:Get:78 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 20.2.0-712-g70f8415b-1jammy [45.5 MB]
2026-03-23T17:29:21.376 INFO:teuthology.orchestra.run.vm04.stdout:Get:79 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 20.2.0-712-g70f8415b-1jammy [8625 kB]
2026-03-23T17:29:22.263 INFO:teuthology.orchestra.run.vm04.stdout:Get:80 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 20.2.0-712-g70f8415b-1jammy [14.2 kB]
2026-03-23T17:29:22.264 INFO:teuthology.orchestra.run.vm04.stdout:Get:81 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 20.2.0-712-g70f8415b-1jammy [99.5 MB]
2026-03-23T17:29:55.041 INFO:teuthology.orchestra.run.vm04.stdout:Get:82 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 20.2.0-712-g70f8415b-1jammy [135 kB]
2026-03-23T17:29:55.114 INFO:teuthology.orchestra.run.vm04.stdout:Get:83 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-daemon amd64 20.2.0-712-g70f8415b-1jammy [43.3 kB]
2026-03-23T17:29:55.118 INFO:teuthology.orchestra.run.vm04.stdout:Get:84 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-proxy2 amd64 20.2.0-712-g70f8415b-1jammy [30.7 kB]
2026-03-23T17:29:55.170 INFO:teuthology.orchestra.run.vm04.stdout:Get:85 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 20.2.0-712-g70f8415b-1jammy [41.5 kB]
2026-03-23T17:29:55.216 INFO:teuthology.orchestra.run.vm04.stdout:Get:86 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 20.2.0-712-g70f8415b-1jammy [25.1 MB]
2026-03-23T17:30:10.207 INFO:teuthology.orchestra.run.vm04.stdout:Get:87 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 20.2.0-712-g70f8415b-1jammy [97.9 kB]
2026-03-23T17:30:10.475 INFO:teuthology.orchestra.run.vm04.stdout:Fetched 281 MB in 1min 11s (3963 kB/s)
2026-03-23T17:30:10.645 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package liblttng-ust1:amd64.
2026-03-23T17:30:10.681 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 119262 files and directories currently installed.)
2026-03-23T17:30:10.683 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../00-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ...
2026-03-23T17:30:10.691 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-23T17:30:10.711 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libdouble-conversion3:amd64.
2026-03-23T17:30:10.717 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../01-libdouble-conversion3_3.1.7-4_amd64.deb ...
2026-03-23T17:30:10.718 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-23T17:30:10.733 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libpcre2-16-0:amd64.
2026-03-23T17:30:10.739 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../02-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ...
2026-03-23T17:30:10.740 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-23T17:30:10.761 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libqt5core5a:amd64.
2026-03-23T17:30:10.767 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../03-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-23T17:30:10.771 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T17:30:10.813 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libqt5dbus5:amd64.
2026-03-23T17:30:10.820 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../04-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-23T17:30:10.821 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T17:30:10.839 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libqt5network5:amd64.
2026-03-23T17:30:10.845 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../05-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-23T17:30:10.846 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T17:30:10.871 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libthrift-0.16.0:amd64.
2026-03-23T17:30:10.877 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../06-libthrift-0.16.0_0.16.0-2_amd64.deb ...
2026-03-23T17:30:10.878 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-23T17:30:10.903 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../07-librbd1_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:10.905 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking librbd1 (20.2.0-712-g70f8415b-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-23T17:30:10.964 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../08-librados2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:10.966 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking librados2 (20.2.0-712-g70f8415b-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-23T17:30:11.022 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libnbd0.
2026-03-23T17:30:11.028 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../09-libnbd0_1.10.5-1_amd64.deb ...
2026-03-23T17:30:11.028 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libnbd0 (1.10.5-1) ...
2026-03-23T17:30:11.042 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libcephfs2.
2026-03-23T17:30:11.048 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../10-libcephfs2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.049 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.073 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-rados.
2026-03-23T17:30:11.079 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../11-python3-rados_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.080 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.099 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-ceph-argparse.
2026-03-23T17:30:11.105 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../12-python3-ceph-argparse_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:11.106 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.121 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-cephfs.
2026-03-23T17:30:11.127 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../13-python3-cephfs_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.128 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.146 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-ceph-common.
2026-03-23T17:30:11.152 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../14-python3-ceph-common_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:11.153 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.174 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-wcwidth.
2026-03-23T17:30:11.182 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../15-python3-wcwidth_0.2.5+dfsg1-1_all.deb ...
2026-03-23T17:30:11.182 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-23T17:30:11.198 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-prettytable.
2026-03-23T17:30:11.204 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../16-python3-prettytable_2.5.0-2_all.deb ...
2026-03-23T17:30:11.205 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-prettytable (2.5.0-2) ...
2026-03-23T17:30:11.218 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-rbd.
2026-03-23T17:30:11.223 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../17-python3-rbd_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.224 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.241 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package librdkafka1:amd64.
2026-03-23T17:30:11.245 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../18-librdkafka1_1.8.0-1build1_amd64.deb ...
2026-03-23T17:30:11.246 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-23T17:30:11.264 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package librgw2.
2026-03-23T17:30:11.268 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../19-librgw2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.269 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.403 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-rgw.
2026-03-23T17:30:11.409 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../20-python3-rgw_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.410 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.425 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package liboath0:amd64.
2026-03-23T17:30:11.431 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../21-liboath0_2.6.7-3ubuntu0.1_amd64.deb ...
2026-03-23T17:30:11.432 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-23T17:30:11.445 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libradosstriper1.
2026-03-23T17:30:11.451 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../22-libradosstriper1_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.452 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.469 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-common.
2026-03-23T17:30:11.474 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../23-ceph-common_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.475 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.868 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-base.
2026-03-23T17:30:11.873 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../24-ceph-base_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:11.878 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:11.969 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-jaraco.functools.
2026-03-23T17:30:11.975 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../25-python3-jaraco.functools_3.4.0-2_all.deb ...
2026-03-23T17:30:11.975 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ...
2026-03-23T17:30:11.989 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-cheroot.
2026-03-23T17:30:11.995 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../26-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ...
2026-03-23T17:30:11.996 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-23T17:30:12.014 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-jaraco.classes.
2026-03-23T17:30:12.020 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../27-python3-jaraco.classes_3.2.1-3_all.deb ...
2026-03-23T17:30:12.021 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ...
2026-03-23T17:30:12.035 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-jaraco.text.
2026-03-23T17:30:12.041 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../28-python3-jaraco.text_3.6.0-2_all.deb ...
2026-03-23T17:30:12.042 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-jaraco.text (3.6.0-2) ...
2026-03-23T17:30:12.055 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-jaraco.collections.
2026-03-23T17:30:12.062 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../29-python3-jaraco.collections_3.4.0-2_all.deb ...
2026-03-23T17:30:12.063 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ...
2026-03-23T17:30:12.077 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-tempora.
2026-03-23T17:30:12.084 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../30-python3-tempora_4.1.2-1_all.deb ...
2026-03-23T17:30:12.085 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-tempora (4.1.2-1) ...
2026-03-23T17:30:12.101 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-portend.
2026-03-23T17:30:12.107 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../31-python3-portend_3.0.0-1_all.deb ...
2026-03-23T17:30:12.108 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-portend (3.0.0-1) ...
2026-03-23T17:30:12.122 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-zc.lockfile.
2026-03-23T17:30:12.128 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../32-python3-zc.lockfile_2.0-1_all.deb ...
2026-03-23T17:30:12.128 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-zc.lockfile (2.0-1) ...
2026-03-23T17:30:12.143 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-cherrypy3.
2026-03-23T17:30:12.149 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../33-python3-cherrypy3_18.6.1-4_all.deb ...
2026-03-23T17:30:12.150 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ...
2026-03-23T17:30:12.180 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-natsort.
2026-03-23T17:30:12.186 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../34-python3-natsort_8.0.2-1_all.deb ...
2026-03-23T17:30:12.187 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-natsort (8.0.2-1) ...
2026-03-23T17:30:12.204 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mgr-modules-core.
2026-03-23T17:30:12.209 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../35-ceph-mgr-modules-core_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:12.209 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.242 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libsqlite3-mod-ceph.
2026-03-23T17:30:12.246 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../36-libsqlite3-mod-ceph_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.247 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.263 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mgr.
2026-03-23T17:30:12.267 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../37-ceph-mgr_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.268 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.342 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mon.
2026-03-23T17:30:12.346 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../38-ceph-mon_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.347 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.436 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libfuse2:amd64.
2026-03-23T17:30:12.443 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../39-libfuse2_2.9.9-5ubuntu3_amd64.deb ...
2026-03-23T17:30:12.443 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-23T17:30:12.463 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-osd.
2026-03-23T17:30:12.469 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../40-ceph-osd_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.470 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.716 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph.
2026-03-23T17:30:12.722 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../41-ceph_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.722 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.738 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-fuse.
2026-03-23T17:30:12.743 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../42-ceph-fuse_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.744 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.772 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mds.
2026-03-23T17:30:12.778 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../43-ceph-mds_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.779 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.821 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package cephadm.
2026-03-23T17:30:12.827 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../44-cephadm_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:12.828 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.846 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-asyncssh.
2026-03-23T17:30:12.852 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../45-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ...
2026-03-23T17:30:12.853 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-23T17:30:12.880 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mgr-cephadm.
2026-03-23T17:30:12.885 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../46-ceph-mgr-cephadm_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:12.886 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:12.910 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-repoze.lru.
2026-03-23T17:30:12.916 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../47-python3-repoze.lru_0.7-2_all.deb ...
2026-03-23T17:30:12.917 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-repoze.lru (0.7-2) ...
2026-03-23T17:30:12.933 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-routes.
2026-03-23T17:30:12.939 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../48-python3-routes_2.5.1-1ubuntu1_all.deb ...
2026-03-23T17:30:12.939 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ...
2026-03-23T17:30:12.961 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mgr-dashboard.
2026-03-23T17:30:12.967 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../49-ceph-mgr-dashboard_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:12.968 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:14.632 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-sklearn-lib:amd64.
2026-03-23T17:30:14.638 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../50-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ...
2026-03-23T17:30:14.667 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-23T17:30:14.724 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-joblib.
2026-03-23T17:30:14.730 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../51-python3-joblib_0.17.0-4ubuntu1_all.deb ...
2026-03-23T17:30:14.731 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ...
2026-03-23T17:30:14.766 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-threadpoolctl.
2026-03-23T17:30:14.772 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../52-python3-threadpoolctl_3.1.0-1_all.deb ...
2026-03-23T17:30:14.773 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ...
2026-03-23T17:30:14.791 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-sklearn.
2026-03-23T17:30:14.798 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../53-python3-sklearn_0.23.2-5ubuntu6_all.deb ...
2026-03-23T17:30:14.799 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-23T17:30:14.924 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local.
2026-03-23T17:30:14.931 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../54-ceph-mgr-diskprediction-local_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:14.931 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:15.169 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-cachetools.
2026-03-23T17:30:15.175 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../55-python3-cachetools_5.0.0-1_all.deb ...
2026-03-23T17:30:15.175 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-cachetools (5.0.0-1) ...
2026-03-23T17:30:15.191 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-rsa.
2026-03-23T17:30:15.197 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../56-python3-rsa_4.8-1_all.deb ...
2026-03-23T17:30:15.198 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-rsa (4.8-1) ...
2026-03-23T17:30:15.216 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-google-auth.
2026-03-23T17:30:15.223 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../57-python3-google-auth_1.5.1-3_all.deb ...
2026-03-23T17:30:15.224 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-google-auth (1.5.1-3) ...
2026-03-23T17:30:15.244 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-requests-oauthlib.
2026-03-23T17:30:15.250 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../58-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ...
2026-03-23T17:30:15.251 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-23T17:30:15.268 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-websocket.
2026-03-23T17:30:15.275 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../59-python3-websocket_1.2.3-1_all.deb ...
2026-03-23T17:30:15.276 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-websocket (1.2.3-1) ...
2026-03-23T17:30:15.294 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-kubernetes.
2026-03-23T17:30:15.301 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../60-python3-kubernetes_12.0.1-1ubuntu1_all.deb ...
2026-03-23T17:30:15.302 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-23T17:30:15.427 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-mgr-k8sevents.
2026-03-23T17:30:15.433 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../61-ceph-mgr-k8sevents_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:15.433 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:15.448 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libonig5:amd64.
2026-03-23T17:30:15.454 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../62-libonig5_6.9.7.1-2build1_amd64.deb ...
2026-03-23T17:30:15.455 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-23T17:30:15.471 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libjq1:amd64.
2026-03-23T17:30:15.477 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../63-libjq1_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-23T17:30:15.478 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-23T17:30:15.491 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package jq.
2026-03-23T17:30:15.497 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../64-jq_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-23T17:30:15.498 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ...
2026-03-23T17:30:15.512 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package socat.
2026-03-23T17:30:15.518 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../65-socat_1.7.4.1-3ubuntu4_amd64.deb ...
2026-03-23T17:30:15.519 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ...
2026-03-23T17:30:15.541 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package xmlstarlet.
2026-03-23T17:30:15.547 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../66-xmlstarlet_1.6.1-2.1_amd64.deb ...
2026-03-23T17:30:15.548 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking xmlstarlet (1.6.1-2.1) ...
2026-03-23T17:30:15.593 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-test.
2026-03-23T17:30:15.600 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../67-ceph-test_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:15.601 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:16.846 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package ceph-volume.
2026-03-23T17:30:16.852 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../68-ceph-volume_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-23T17:30:16.853 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:16.879 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libcephfs-daemon.
2026-03-23T17:30:16.885 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../69-libcephfs-daemon_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:16.886 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:16.901 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libcephfs-proxy2.
2026-03-23T17:30:16.907 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../70-libcephfs-proxy2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:16.908 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:16.922 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package libcephfs-dev.
2026-03-23T17:30:16.928 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../71-libcephfs-dev_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:16.929 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:16.948 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package nvme-cli.
2026-03-23T17:30:16.955 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../72-nvme-cli_1.16-3ubuntu0.3_amd64.deb ...
2026-03-23T17:30:16.956 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ...
2026-03-23T17:30:16.993 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python-asyncssh-doc.
2026-03-23T17:30:16.999 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../73-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ...
2026-03-23T17:30:17.000 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-23T17:30:17.038 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-iniconfig.
2026-03-23T17:30:17.044 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../74-python3-iniconfig_1.1.1-2_all.deb ...
2026-03-23T17:30:17.045 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-iniconfig (1.1.1-2) ...
2026-03-23T17:30:17.060 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-pluggy.
2026-03-23T17:30:17.066 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../75-python3-pluggy_0.13.0-7.1_all.deb ...
2026-03-23T17:30:17.067 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-pluggy (0.13.0-7.1) ...
2026-03-23T17:30:17.084 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-psutil.
2026-03-23T17:30:17.091 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../76-python3-psutil_5.9.0-1build1_amd64.deb ...
2026-03-23T17:30:17.091 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-psutil (5.9.0-1build1) ...
2026-03-23T17:30:17.113 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-py.
2026-03-23T17:30:17.119 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../77-python3-py_1.10.0-1_all.deb ...
2026-03-23T17:30:17.120 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-py (1.10.0-1) ...
2026-03-23T17:30:17.143 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-pygments.
2026-03-23T17:30:17.150 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../78-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ...
2026-03-23T17:30:17.151 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-23T17:30:17.210 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-toml.
2026-03-23T17:30:17.216 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../79-python3-toml_0.10.2-1_all.deb ...
2026-03-23T17:30:17.217 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-toml (0.10.2-1) ...
2026-03-23T17:30:17.233 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-pytest.
2026-03-23T17:30:17.239 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../80-python3-pytest_6.2.5-1ubuntu2_all.deb ...
2026-03-23T17:30:17.240 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ...
2026-03-23T17:30:17.279 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-simplejson.
2026-03-23T17:30:17.285 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../81-python3-simplejson_3.17.6-1build1_amd64.deb ...
2026-03-23T17:30:17.286 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-simplejson (3.17.6-1build1) ...
2026-03-23T17:30:17.305 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-webob.
2026-03-23T17:30:17.311 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../82-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ...
2026-03-23T17:30:17.312 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-23T17:30:17.331 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package qttranslations5-l10n.
2026-03-23T17:30:17.337 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../83-qttranslations5-l10n_5.15.3-1_all.deb ...
2026-03-23T17:30:17.338 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ...
2026-03-23T17:30:17.437 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package radosgw.
2026-03-23T17:30:17.443 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../84-radosgw_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:17.444 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:17.739 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package rbd-fuse.
2026-03-23T17:30:17.743 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../85-rbd-fuse_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-23T17:30:17.744 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:17.764 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package smartmontools.
2026-03-23T17:30:17.767 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../86-smartmontools_7.2-1ubuntu0.1_amd64.deb ...
2026-03-23T17:30:17.775 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ...
2026-03-23T17:30:17.814 INFO:teuthology.orchestra.run.vm04.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ...
2026-03-23T17:30:18.045 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service.
2026-03-23T17:30:18.045 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service.
2026-03-23T17:30:18.359 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-iniconfig (1.1.1-2) ...
2026-03-23T17:30:18.431 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-23T17:30:18.433 INFO:teuthology.orchestra.run.vm04.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ...
2026-03-23T17:30:18.495 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service.
2026-03-23T17:30:18.709 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service.
2026-03-23T17:30:19.060 INFO:teuthology.orchestra.run.vm04.stdout:nvmf-connect.target is a disabled or a static unit, not starting it.
2026-03-23T17:30:19.076 INFO:teuthology.orchestra.run.vm04.stdout:Setting up cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:19.121 INFO:teuthology.orchestra.run.vm04.stdout:Adding system user cephadm....done
2026-03-23T17:30:19.131 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-jaraco.classes (3.2.1-3) ...
2026-03-23T17:30:19.197 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-23T17:30:19.199 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-jaraco.functools (3.4.0-2) ...
2026-03-23T17:30:19.262 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-repoze.lru (0.7-2) ...
2026-03-23T17:30:19.329 INFO:teuthology.orchestra.run.vm04.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-23T17:30:19.331 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-py (1.10.0-1) ...
2026-03-23T17:30:19.416 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ...
2026-03-23T17:30:19.556 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-cachetools (5.0.0-1) ...
2026-03-23T17:30:19.626 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-threadpoolctl (3.1.0-1) ...
2026-03-23T17:30:19.723 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:19.810 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-23T17:30:19.812 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libnbd0 (1.10.5-1) ...
2026-03-23T17:30:19.814 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-23T17:30:19.816 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-23T17:30:19.818 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-psutil (5.9.0-1build1) ...
2026-03-23T17:30:19.941 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-natsort (8.0.2-1) ...
2026-03-23T17:30:20.011 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:20.013 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ...
2026-03-23T17:30:20.158 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-simplejson (3.17.6-1build1) ...
2026-03-23T17:30:20.238 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-23T17:30:20.503 INFO:teuthology.orchestra.run.vm04.stdout:Setting up qttranslations5-l10n (5.15.3-1) ...
2026-03-23T17:30:20.505 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-23T17:30:20.596 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-23T17:30:20.732 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-23T17:30:20.820 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-jaraco.text (3.6.0-2) ...
2026-03-23T17:30:20.890 INFO:teuthology.orchestra.run.vm04.stdout:Setting up socat (1.7.4.1-3ubuntu4) ...
2026-03-23T17:30:20.893 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:20.987 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-23T17:30:21.542 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T17:30:21.547 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-toml (0.10.2-1) ...
2026-03-23T17:30:21.629 INFO:teuthology.orchestra.run.vm04.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-23T17:30:21.632 INFO:teuthology.orchestra.run.vm04.stdout:Setting up xmlstarlet (1.6.1-2.1) ...
2026-03-23T17:30:21.634 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-pluggy (0.13.0-7.1) ...
2026-03-23T17:30:21.706 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-zc.lockfile (2.0-1) ...
2026-03-23T17:30:21.774 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T17:30:21.776 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-rsa (4.8-1) ...
2026-03-23T17:30:21.853 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-tempora (4.1.2-1) ...
2026-03-23T17:30:21.925 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-prettytable (2.5.0-2) ...
2026-03-23T17:30:21.999 INFO:teuthology.orchestra.run.vm04.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-23T17:30:22.001 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-websocket (1.2.3-1) ...
2026-03-23T17:30:22.124 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-23T17:30:22.128 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-23T17:30:22.199 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-23T17:30:22.287 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-jaraco.collections (3.4.0-2) ...
2026-03-23T17:30:22.354 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-23T17:30:22.357 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ...
2026-03-23T17:30:22.494 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-portend (3.0.0-1) ...
2026-03-23T17:30:22.562 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T17:30:22.564 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-google-auth (1.5.1-3) ...
2026-03-23T17:30:22.648 INFO:teuthology.orchestra.run.vm04.stdout:Setting up jq (1.6-2.1ubuntu3.1) ...
2026-03-23T17:30:22.650 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-cherrypy3 (18.6.1-4) ...
2026-03-23T17:30:22.783 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-23T17:30:22.785 INFO:teuthology.orchestra.run.vm04.stdout:Setting up librados2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:22.787 INFO:teuthology.orchestra.run.vm04.stdout:Setting up librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:22.789 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:22.791 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-23T17:30:23.349 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.352 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.354 INFO:teuthology.orchestra.run.vm04.stdout:Setting up librbd1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.356 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.358 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.420 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-23T17:30:23.420 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-23T17:30:23.774 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.784 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.786 INFO:teuthology.orchestra.run.vm04.stdout:Setting up libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.788 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.790 INFO:teuthology.orchestra.run.vm04.stdout:Setting up rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.792 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.794 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.796 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:23.829 INFO:teuthology.orchestra.run.vm04.stdout:Adding group ceph....done
2026-03-23T17:30:23.866 INFO:teuthology.orchestra.run.vm04.stdout:Adding system user ceph....done
2026-03-23T17:30:23.877 INFO:teuthology.orchestra.run.vm04.stdout:Setting system user ceph properties....done
2026-03-23T17:30:23.881 INFO:teuthology.orchestra.run.vm04.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory
2026-03-23T17:30:23.952 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target.
2026-03-23T17:30:24.177 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service.
2026-03-23T17:30:24.544 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:24.546 INFO:teuthology.orchestra.run.vm04.stdout:Setting up radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:24.775 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-23T17:30:24.775 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-23T17:30:25.122 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:25.214 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service.
2026-03-23T17:30:25.665 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:25.828 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-23T17:30:25.828 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-23T17:30:26.241 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:26.591 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-23T17:30:26.591 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-23T17:30:26.964 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.039 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-23T17:30:27.039 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-23T17:30:27.380 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.423 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.475 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.542 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-23T17:30:27.542 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-23T17:30:27.881 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.896 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.899 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:27.911 INFO:teuthology.orchestra.run.vm04.stdout:Setting up ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T17:30:28.027 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T17:30:28.103 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-23T17:30:28.396 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:28.396 INFO:teuthology.orchestra.run.vm04.stdout:Running kernel seems to be up-to-date.
2026-03-23T17:30:28.396 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:28.396 INFO:teuthology.orchestra.run.vm04.stdout:Services to be restarted:
2026-03-23T17:30:28.398 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart apache-htcacheclean.service
2026-03-23T17:30:28.404 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart rsyslog.service
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:Service restarts being deferred:
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart networkd-dispatcher.service
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart unattended-upgrades.service
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:No containers need to be restarted.
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:No user sessions are running outdated binaries.
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:28.408 INFO:teuthology.orchestra.run.vm04.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-23T17:30:29.211 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T17:30:29.214 DEBUG:teuthology.orchestra.run.vm04:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-jmespath python3-xmltodict s3cmd
2026-03-23T17:30:29.291 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T17:30:29.452 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T17:30:29.452 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T17:30:29.571 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T17:30:29.571 INFO:teuthology.orchestra.run.vm04.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-23T17:30:29.571 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-23T17:30:29.572 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T17:30:29.585 INFO:teuthology.orchestra.run.vm04.stdout:The following NEW packages will be installed:
2026-03-23T17:30:29.585 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath python3-xmltodict s3cmd
2026-03-23T17:30:29.824 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 3 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T17:30:29.824 INFO:teuthology.orchestra.run.vm04.stdout:Need to get 155 kB of archives.
2026-03-23T17:30:29.824 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 678 kB of additional disk space will be used.
2026-03-23T17:30:29.824 INFO:teuthology.orchestra.run.vm04.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB]
2026-03-23T17:30:30.042 INFO:teuthology.orchestra.run.vm04.stdout:Get:2 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB]
2026-03-23T17:30:30.067 INFO:teuthology.orchestra.run.vm04.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy/universe amd64 s3cmd all 2.2.0-1 [120 kB]
2026-03-23T17:30:30.484 INFO:teuthology.orchestra.run.vm04.stdout:Fetched 155 kB in 1s (220 kB/s)
2026-03-23T17:30:30.501 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-jmespath.
2026-03-23T17:30:30.538 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126082 files and directories currently installed.)
2026-03-23T17:30:30.541 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ...
2026-03-23T17:30:30.542 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-jmespath (0.10.0-1) ...
2026-03-23T17:30:30.561 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package python3-xmltodict.
2026-03-23T17:30:30.569 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ...
2026-03-23T17:30:30.570 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking python3-xmltodict (0.12.0-2) ...
2026-03-23T17:30:30.587 INFO:teuthology.orchestra.run.vm04.stdout:Selecting previously unselected package s3cmd.
2026-03-23T17:30:30.594 INFO:teuthology.orchestra.run.vm04.stdout:Preparing to unpack .../archives/s3cmd_2.2.0-1_all.deb ...
2026-03-23T17:30:30.595 INFO:teuthology.orchestra.run.vm04.stdout:Unpacking s3cmd (2.2.0-1) ...
2026-03-23T17:30:30.628 INFO:teuthology.orchestra.run.vm04.stdout:Setting up s3cmd (2.2.0-1) ...
2026-03-23T17:30:30.730 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-xmltodict (0.12.0-2) ...
2026-03-23T17:30:30.808 INFO:teuthology.orchestra.run.vm04.stdout:Setting up python3-jmespath (0.10.0-1) ...
2026-03-23T17:30:30.892 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T17:30:31.234 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:31.234 INFO:teuthology.orchestra.run.vm04.stdout:Running kernel seems to be up-to-date.
2026-03-23T17:30:31.234 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:31.234 INFO:teuthology.orchestra.run.vm04.stdout:Services to be restarted:
2026-03-23T17:30:31.238 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart apache-htcacheclean.service
2026-03-23T17:30:31.245 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart rsyslog.service
2026-03-23T17:30:31.248 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:31.248 INFO:teuthology.orchestra.run.vm04.stdout:Service restarts being deferred:
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart networkd-dispatcher.service
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout: systemctl restart unattended-upgrades.service
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout:No containers need to be restarted.
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout:No user sessions are running outdated binaries.
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:31.249 INFO:teuthology.orchestra.run.vm04.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-23T17:30:32.114 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T17:30:32.118 DEBUG:teuthology.parallel:result is None
2026-03-23T17:30:32.118 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-23T17:30:32.699 DEBUG:teuthology.orchestra.run.vm04:> dpkg-query -W -f '${Version}' ceph
2026-03-23T17:30:32.708 INFO:teuthology.orchestra.run.vm04.stdout:20.2.0-712-g70f8415b-1jammy
2026-03-23T17:30:32.708 INFO:teuthology.packaging:The installed version of ceph is 20.2.0-712-g70f8415b-1jammy
2026-03-23T17:30:32.708 INFO:teuthology.task.install:The correct ceph version 20.2.0-712-g70f8415b-1jammy is installed.
2026-03-23T17:30:32.709 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-23T17:30:32.709 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:32.709 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-23T17:30:32.760 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-23T17:30:32.760 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:32.760 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/daemon-helper
2026-03-23T17:30:32.812 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-23T17:30:32.860 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-23T17:30:32.860 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:32.860 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-23T17:30:32.912 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-23T17:30:32.963 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-23T17:30:32.963 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:32.963 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/stdin-killer
2026-03-23T17:30:33.011 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-23T17:30:33.059 INFO:teuthology.run_tasks:Running task ceph...
2026-03-23T17:30:33.101 INFO:tasks.ceph:Making ceph log dir writeable by non-root...
2026-03-23T17:30:33.101 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 777 /var/log/ceph
2026-03-23T17:30:33.109 INFO:tasks.ceph:Disabling ceph logrotate...
2026-03-23T17:30:33.109 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/logrotate.d/ceph
2026-03-23T17:30:33.159 INFO:tasks.ceph:Creating extra log directories...
2026-03-23T17:30:33.159 DEBUG:teuthology.orchestra.run.vm04:> sudo install -d -m0777 -- /var/log/ceph/valgrind /var/log/ceph/profiling-logger
2026-03-23T17:30:33.210 INFO:tasks.ceph:Creating ceph cluster ceph...
2026-03-23T17:30:33.210 INFO:tasks.ceph:config {'conf': {'client': {'rbd default data pool': 'datapool', 'rbd default features': 61}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluestore block size': 96636764160, 'bluestore compression algorithm': 'zstd', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'enable experimental unrecoverable data corrupting features': '*', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug randomize hobject sort order': False, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'fs': 'xfs', 'mkfs_options': None, 'mount_options': None, 'skip_mgr_daemons': False, 'log_ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'cpu_profile': set(), 'cluster': 'ceph', 'mon_bind_msgr2': True, 'mon_bind_addrvec': True}
2026-03-23T17:30:33.210 INFO:tasks.ceph:ctx.config {'archive_path': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501', 'branch': 'tentacle', 'description': 'rbd/cli/{base/install clusters/{fixed-1} conf/{disable-pool-app} data-pool/ec features/defaults msgr-failures/few objectstore/bluestore-comp-zstd supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}', 'email': None, 'first_in_suite': False, 'flavor': 'default', 'job_id': '3501', 'ktype': 'distro', 'last_in_suite': False, 'machine_type': 'vps', 'name': 'kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps', 'no_nested_subset': False, 'os_type': 'ubuntu', 'os_version': '22.04', 'overrides': {'admin_socket': {'branch': 'tentacle'}, 'ansible.cephlab': {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}, 'ceph': {'conf': {'client': {'rbd default data pool': 'datapool', 'rbd default features': 61}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluestore block size': 96636764160, 'bluestore compression algorithm': 'zstd', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'enable experimental unrecoverable data corrupting features': '*', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug randomize hobject sort order': False, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'ceph-deploy': {'conf': {'client': {'log file': '/var/log/ceph/ceph-$name.$pid.log'}, 'global': {'osd crush chooseleaf type': 0, 'osd pool default pg num': 128, 'osd pool default pgp num': 128, 'osd pool default size': 2}, 'mon': {}}}, 'cephadm': {'cephadm_binary_url': 'https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm'}, 'install': {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}, 'thrashosds': {'bdev_inject_crash': 2, 'bdev_inject_crash_probability': 0.5}, 'workunit': {'branch': 'tt-tentacle', 'sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4'}}, 'owner': 'kyr', 'priority': 1000, 'repo': 'https://github.com/ceph/ceph.git', 'roles': [['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']], 'seed': 3051, 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'sleep_before_teardown': 0, 'subset': '1/128', 'suite': 'rbd', 'suite_branch': 'tt-tentacle', 'suite_path': '/home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa', 'suite_relpath': 'qa', 'suite_repo': 'https://github.com/kshtsk/ceph.git', 'suite_sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4', 'targets': {'vm04.local': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBJoXz0D/Nu8/hzq2ZZlacEgnDdWCoJRGl3xE/MAUqns0Wim/v+eJZAklj7iE3Nx/DuH33O/sKetP0pALV/8LkE='}, 'tasks': [{'internal.check_packages': None}, {'internal.buildpackages_prep': None}, {'internal.save_config': None}, {'internal.check_lock': None}, {'internal.add_remotes': None}, {'console_log': None}, {'internal.connect': None}, {'internal.push_inventory': None}, {'internal.serialize_remote_roles': None}, {'internal.check_conflict': None}, {'internal.check_ceph_data': None}, {'internal.vm_setup': None}, {'internal.base': None}, {'internal.archive_upload': None}, {'internal.archive': None}, {'internal.coredump': None}, {'internal.sudo': None}, {'internal.syslog': None}, {'internal.timer': None}, {'pcp': None}, {'selinux': None}, {'ansible.cephlab': None}, {'clock': None}, {'install': None}, {'ceph': None}, {'exec': {'client.0': ['sudo ceph osd erasure-code-profile set teuthologyprofile crush-failure-domain=osd m=1 k=2', 'sudo ceph osd pool create datapool 4 4 erasure teuthologyprofile', 'sudo ceph osd pool set datapool allow_ec_overwrites true', 'rbd pool init datapool']}}, {'workunit': {'clients': {'client.0': ['rbd/cli_generic.sh']}}}], 'teuthology': {'fragments_dropped': [], 'meta': {}, 'postmerge': []}, 'teuthology_branch': 'clyso-debian-13', 'teuthology_repo': 'https://github.com/clyso/teuthology', 'teuthology_sha1': '1c580df7a9c7c2aadc272da296344fd99f27c444', 'timestamp': '2026-03-20_22:04:26', 'tube': 'vps', 'user': 'kyr', 'verbose': False, 'worker_log': '/home/teuthos/.teuthology/dispatcher/dispatcher.vps.2366871'}
2026-03-23T17:30:33.210 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/ceph.data
2026-03-23T17:30:33.251 DEBUG:teuthology.orchestra.run.vm04:> sudo install -d -m0777 -- /var/run/ceph
2026-03-23T17:30:33.300 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:33.300 DEBUG:teuthology.orchestra.run.vm04:> dd if=/scratch_devs of=/dev/stdout
2026-03-23T17:30:33.347 DEBUG:teuthology.misc:devs=['/dev/vg_nvme/lv_1', '/dev/vg_nvme/lv_2', '/dev/vg_nvme/lv_3', '/dev/vg_nvme/lv_4']
2026-03-23T17:30:33.347 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_1
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_1 -> ../dm-0
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout:Device: 5h/5d Inode: 786 Links: 1
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-23 17:28:39.722621000 +0000
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-23 17:28:39.590621000 +0000
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-23 17:28:39.590621000 +0000
2026-03-23T17:30:33.391 INFO:teuthology.orchestra.run.vm04.stdout: Birth: -
2026-03-23T17:30:33.391 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_1 of=/dev/null count=1
2026-03-23T17:30:33.438 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-23T17:30:33.438 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-23T17:30:33.438 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000158286 s, 3.2 MB/s
2026-03-23T17:30:33.439 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_1
2026-03-23T17:30:33.484 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_2
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_2 -> ../dm-1
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout:Device: 5h/5d Inode: 818 Links: 1
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-23 17:28:40.010621000 +0000
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-23 17:28:39.874621000 +0000
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-23 17:28:39.874621000 +0000
2026-03-23T17:30:33.531 INFO:teuthology.orchestra.run.vm04.stdout: Birth: -
2026-03-23T17:30:33.531 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_2 of=/dev/null count=1
2026-03-23T17:30:33.579 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-23T17:30:33.579 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-23T17:30:33.579 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.0001552 s, 3.3 MB/s
2026-03-23T17:30:33.580 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_2
2026-03-23T17:30:33.628 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_3
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_3 -> ../dm-2
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout:Device: 5h/5d Inode: 851 Links: 1
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-23 17:28:40.306621000 +0000
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-23 17:28:40.178621000 +0000
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-23 17:28:40.178621000 +0000
2026-03-23T17:30:33.671 INFO:teuthology.orchestra.run.vm04.stdout: Birth: -
2026-03-23T17:30:33.671 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_3 of=/dev/null count=1
2026-03-23T17:30:33.718 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-23T17:30:33.719 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-23T17:30:33.719 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.00017643 s, 2.9 MB/s
2026-03-23T17:30:33.719 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_3
2026-03-23T17:30:33.763 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_4
2026-03-23T17:30:33.806 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_4 -> ../dm-3
2026-03-23T17:30:33.806 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-23T17:30:33.806 INFO:teuthology.orchestra.run.vm04.stdout:Device: 5h/5d Inode: 877 Links: 1
2026-03-23T17:30:33.807 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-23T17:30:33.807 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-23 17:28:44.354621000 +0000
2026-03-23T17:30:33.807 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-23 17:28:40.458621000 +0000
2026-03-23T17:30:33.807 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-23 17:28:40.458621000 +0000
2026-03-23T17:30:33.807 INFO:teuthology.orchestra.run.vm04.stdout: Birth: -
2026-03-23T17:30:33.807 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_4 of=/dev/null count=1
2026-03-23T17:30:33.855 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-23T17:30:33.855 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-23T17:30:33.855 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000175208 s, 2.9 MB/s
2026-03-23T17:30:33.855 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_4
2026-03-23T17:30:33.899 INFO:tasks.ceph:osd dev map: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-23T17:30:33.899 INFO:tasks.ceph:remote_to_roles_to_devs: {Remote(name='ubuntu@vm04.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}}
2026-03-23T17:30:33.899 INFO:tasks.ceph:Generating config...
2026-03-23T17:30:33.900 INFO:tasks.ceph:[client] rbd default data pool = datapool
2026-03-23T17:30:33.900 INFO:tasks.ceph:[client] rbd default features = 61
2026-03-23T17:30:33.900 INFO:tasks.ceph:[global] mon client directed command retry = 5
2026-03-23T17:30:33.900 INFO:tasks.ceph:[global] mon warn on pool no app = False
2026-03-23T17:30:33.900 INFO:tasks.ceph:[global] ms inject socket failures = 5000
2026-03-23T17:30:33.900 INFO:tasks.ceph:[mgr] debug mgr = 20
2026-03-23T17:30:33.900 INFO:tasks.ceph:[mgr] debug ms = 1
2026-03-23T17:30:33.900 INFO:tasks.ceph:[mon] debug mon = 20
2026-03-23T17:30:33.900 INFO:tasks.ceph:[mon] debug ms = 1
2026-03-23T17:30:33.900 INFO:tasks.ceph:[mon] debug paxos = 20
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] bluestore block size = 96636764160
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] bluestore compression algorithm = zstd
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] bluestore compression mode = aggressive
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] bluestore fsck on mount = True
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] debug bluefs = 1/20
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] debug bluestore = 1/20
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] debug ms = 1
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] debug osd = 20
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] debug rocksdb = 4/10
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] enable experimental unrecoverable data corrupting features = *
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] mon osd backfillfull_ratio = 0.85
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] mon osd full ratio = 0.9
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] mon osd nearfull ratio = 0.8
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] osd debug randomize hobject sort order = False
2026-03-23T17:30:33.900 INFO:tasks.ceph:[osd] osd failsafe full ratio = 0.95
2026-03-23T17:30:33.901 INFO:tasks.ceph:[osd] osd mclock iops capacity threshold hdd = 49000
2026-03-23T17:30:33.901 INFO:tasks.ceph:[osd] osd objectstore = bluestore
2026-03-23T17:30:33.901 INFO:tasks.ceph:[osd] osd shutdown pgref assert = True
2026-03-23T17:30:33.901 INFO:tasks.ceph:Setting up mon.a...
2026-03-23T17:30:33.901 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring /etc/ceph/ceph.keyring
2026-03-23T17:30:33.956 INFO:teuthology.orchestra.run.vm04.stdout:creating /etc/ceph/ceph.keyring
2026-03-23T17:30:33.958 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=mon. /etc/ceph/ceph.keyring
2026-03-23T17:30:34.018 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0644 /etc/ceph/ceph.keyring
2026-03-23T17:30:34.068 DEBUG:tasks.ceph:Ceph mon addresses: [('mon.a', '192.168.123.104')]
2026-03-23T17:30:34.068 DEBUG:tasks.ceph:writing out conf {'global': {'chdir': '', 'pid file': '/var/run/ceph/$cluster-$name.pid', 'auth supported': 'cephx', 'filestore xattr use omap': 'true', 'mon clock drift allowed': '1.000', 'osd crush chooseleaf type': '0', 'auth debug': 'true', 'ms die on old message': 'true', 'ms die on bug': 'true', 'mon max pg per osd': '10000', 'mon pg warn max object skew': '0', 'osd_pool_default_pg_autoscale_mode': 'off', 'osd pool default size': '2', 'mon osd allow primary affinity': 'true', 'mon osd allow pg remap': 'true', 'mon warn on legacy crush tunables': 'false', 'mon warn on crush straw calc version zero': 'false', 'mon warn on no sortbitwise': 'false', 'mon warn on osd down out interval zero': 'false', 'mon warn on too few osds': 'false', 'mon_warn_on_pool_pg_num_not_power_of_two': 'false', 'mon_warn_on_pool_no_redundancy': 'false', 'mon_allow_pool_size_one': 'true', 'osd pool default erasure code profile': 'plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd', 'osd default data pool replay window': '5', 'mon allow pool delete': 'true', 'mon cluster log file level': 'debug', 'debug asserts on shutdown': 'true', 'mon health detail to clog': 'false', 'mon host': '192.168.123.104', 'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'osd': {'osd journal size': '100', 'osd scrub load threshold': '5.0', 'osd scrub max interval': '600', 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': 'true', 'osd recover clone overlap': 'true', 'osd recovery max chunk': '1048576', 'osd debug shutdown': 'true', 'osd debug op order': 'true', 'osd debug verify stray on activate': 'true', 'osd debug trim objects': 'true', 'osd open classes on start': 'true', 'osd debug pg log writeout': 'true', 'osd deep scrub update digest min age': '30', 'osd map max advance': '10', 'journal zero on create': 'true', 'filestore ondisk finisher threads': '3', 'filestore apply finisher threads': '3', 'bdev debug aio': 'true', 'osd debug misdirected ops': 'true', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'zstd', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'enable experimental unrecoverable data corrupting features': '*', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug randomize hobject sort order': False, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}, 'mgr': {'debug ms': 1, 'debug mgr': 20, 'debug mon': '20', 'debug auth': '20', 'mon reweight min pgs per osd': '4', 'mon reweight min bytes per osd': '10', 'mgr/telemetry/nag': 'false'}, 'mon': {'debug ms': 1, 'debug mon': 20, 'debug paxos': 20, 'debug auth': '20', 'mon data avail warn': '5', 'mon mgr mkfs grace': '240', 'mon reweight min pgs per osd': '4', 'mon osd reporter subtree level': 'osd', 'mon osd prime 
pg temp': 'true', 'mon reweight min bytes per osd': '10', 'auth mon ticket ttl': '660', 'auth service ticket ttl': '240', 'mon_warn_on_insecure_global_id_reclaim': 'false', 'mon_warn_on_insecure_global_id_reclaim_allowed': 'false', 'mon_down_mkfs_grace': '2m', 'mon_warn_on_filestore_osds': 'false'}, 'client': {'rgw cache enabled': 'true', 'rgw enable ops log': 'true', 'rgw enable usage log': 'true', 'log file': '/var/log/ceph/$cluster-$name.$pid.log', 'admin socket': '/var/run/ceph/$cluster-$name.$pid.asok', 'rbd default data pool': 'datapool', 'rbd default features': 61}, 'mon.a': {}} 2026-03-23T17:30:34.069 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-23T17:30:34.069 DEBUG:teuthology.orchestra.run.vm04:> dd of=/home/ubuntu/cephtest/ceph.tmp.conf 2026-03-23T17:30:34.115 DEBUG:teuthology.orchestra.run.vm04:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage monmaptool -c /home/ubuntu/cephtest/ceph.tmp.conf --create --clobber --enable-all-features --add a 192.168.123.104 --print /home/ubuntu/cephtest/ceph.monmap 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool: monmap file /home/ubuntu/cephtest/ceph.monmap 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool: generated fsid bd8fd485-7f5e-4dd1-a4fb-2f27337796a2 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:setting min_mon_release = tentacle 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:epoch 0 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:fsid bd8fd485-7f5e-4dd1-a4fb-2f27337796a2 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:last_changed 2026-03-23T17:30:34.176654+0000 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-23T17:30:34.176654+0000 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:min_mon_release 20 (tentacle) 2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:election_strategy: 1 
2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:0: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.a
2026-03-23T17:30:34.176 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool: writing epoch 0 to /home/ubuntu/cephtest/ceph.monmap (1 monitors)
2026-03-23T17:30:34.177 DEBUG:teuthology.orchestra.run.vm04:> rm -- /home/ubuntu/cephtest/ceph.tmp.conf
2026-03-23T17:30:34.222 INFO:tasks.ceph:Writing /etc/ceph/ceph.conf for FSID bd8fd485-7f5e-4dd1-a4fb-2f27337796a2...
2026-03-23T17:30:34.223 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /etc/ceph && sudo chmod 0755 /etc/ceph && sudo tee /etc/ceph/ceph.conf && sudo chmod 0644 /etc/ceph/ceph.conf > /dev/null
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout:[global]
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout: chdir = ""
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout: pid file = /var/run/ceph/$cluster-$name.pid
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout: auth supported = cephx
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout: filestore xattr use omap = true
2026-03-23T17:30:34.282 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon clock drift allowed = 1.000
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd crush chooseleaf type = 0
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: auth debug = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: ms die on old message = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: ms die on bug = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon max pg per osd = 10000 # >= luminous
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon pg warn max object skew = 0
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: # disable pg_autoscaler by default for new pools
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd_pool_default_pg_autoscale_mode = off
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd pool default size = 2
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon osd allow primary affinity = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon osd allow pg remap = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on legacy crush tunables = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on crush straw calc version zero = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on no sortbitwise = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on osd down out interval zero = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on too few osds = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_pool_pg_num_not_power_of_two = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_pool_no_redundancy = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon_allow_pool_size_one = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd pool default erasure code profile = plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd default data pool replay window = 5
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon allow pool delete = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon cluster log file level = debug
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: debug asserts on shutdown = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon health detail to clog = false
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon host = 192.168.123.104
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon client directed command retry = 5
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on pool no app = False
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: ms inject socket failures = 5000
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: fsid = bd8fd485-7f5e-4dd1-a4fb-2f27337796a2
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:[osd]
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd journal size = 100
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd scrub load threshold = 5.0
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd scrub max interval = 600
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd mclock profile = high_recovery_ops
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd mclock skip benchmark = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd recover clone overlap = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd recovery max chunk = 1048576
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd debug shutdown = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd debug op order = true
2026-03-23T17:30:34.283 INFO:teuthology.orchestra.run.vm04.stdout: osd debug verify stray on activate = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd debug trim objects = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd open classes on start = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd debug pg log writeout = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd deep scrub update digest min age = 30
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd map max advance = 10
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: journal zero on create = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: filestore ondisk finisher threads = 3
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: filestore apply finisher threads = 3
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: bdev debug aio = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd debug misdirected ops = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: bluestore block size = 96636764160
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: bluestore compression algorithm = zstd
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: bluestore compression mode = aggressive
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: bluestore fsck on mount = True
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug bluefs = 1/20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug bluestore = 1/20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug ms = 1
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug osd = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug rocksdb = 4/10
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: enable experimental unrecoverable data corrupting features = *
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon osd backfillfull_ratio = 0.85
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon osd full ratio = 0.9
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon osd nearfull ratio = 0.8
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd debug randomize hobject sort order = False
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd failsafe full ratio = 0.95
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd mclock iops capacity threshold hdd = 49000
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd objectstore = bluestore
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: osd shutdown pgref assert = True
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:[mgr]
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug ms = 1
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug mgr = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug mon = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug auth = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min pgs per osd = 4
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min bytes per osd = 10
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mgr/telemetry/nag = false
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:[mon]
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug ms = 1
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug mon = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug paxos = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: debug auth = 20
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon data avail warn = 5
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon mgr mkfs grace = 240
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min pgs per osd = 4
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon osd reporter subtree level = osd
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon osd prime pg temp = true
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min bytes per osd = 10
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: # rotate auth tickets quickly to exercise renewal paths
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: auth mon ticket ttl = 660 # 11m
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: auth service ticket ttl = 240 # 4m
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: # don't complain about insecure global_id in the test suite
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_insecure_global_id_reclaim = false
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_insecure_global_id_reclaim_allowed = false
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: # 1m isn't quite enough
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout: mon_down_mkfs_grace = 2m
2026-03-23T17:30:34.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_filestore_osds = false
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout:[client]
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: rgw cache enabled = true
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: rgw enable ops log = true
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: rgw enable usage log = true
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: log file = /var/log/ceph/$cluster-$name.$pid.log
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: admin socket = /var/run/ceph/$cluster-$name.$pid.asok
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: rbd default data pool = datapool
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout: rbd default features = 61
2026-03-23T17:30:34.285 INFO:teuthology.orchestra.run.vm04.stdout:[mon.a]
2026-03-23T17:30:34.288 INFO:tasks.ceph:Creating admin key on mon.a...
2026-03-23T17:30:34.289 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=client.admin --cap mon 'allow *' --cap osd 'allow *' --cap mds 'allow *' --cap mgr 'allow *' /etc/ceph/ceph.keyring
2026-03-23T17:30:34.357 INFO:tasks.ceph:Copying monmap to all nodes...
2026-03-23T17:30:34.357 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:34.357 DEBUG:teuthology.orchestra.run.vm04:> dd if=/etc/ceph/ceph.keyring of=/dev/stdout
2026-03-23T17:30:34.403 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:34.403 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/ceph.monmap of=/dev/stdout
2026-03-23T17:30:34.447 INFO:tasks.ceph:Sending monmap to node ubuntu@vm04.local
2026-03-23T17:30:34.447 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:34.447 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.keyring
2026-03-23T17:30:34.447 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0644 /etc/ceph/ceph.keyring
2026-03-23T17:30:34.502 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:34.502 DEBUG:teuthology.orchestra.run.vm04:> dd of=/home/ubuntu/cephtest/ceph.monmap
2026-03-23T17:30:34.547 INFO:tasks.ceph:Setting up mon nodes...
2026-03-23T17:30:34.554 INFO:tasks.ceph:Setting up mgr nodes...
2026-03-23T17:30:34.554 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/mgr/ceph-x && sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=mgr.x /var/lib/ceph/mgr/ceph-x/keyring
2026-03-23T17:30:34.610 INFO:teuthology.orchestra.run.vm04.stdout:creating /var/lib/ceph/mgr/ceph-x/keyring
2026-03-23T17:30:34.612 INFO:tasks.ceph:Setting up mds nodes...
2026-03-23T17:30:34.612 INFO:tasks.ceph_client:Setting up client nodes...
2026-03-23T17:30:34.612 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=client.0 /etc/ceph/ceph.client.0.keyring && sudo chmod 0644 /etc/ceph/ceph.client.0.keyring
2026-03-23T17:30:34.673 INFO:teuthology.orchestra.run.vm04.stdout:creating /etc/ceph/ceph.client.0.keyring
2026-03-23T17:30:34.681 INFO:tasks.ceph:Running mkfs on osd nodes...
2026-03-23T17:30:34.681 INFO:tasks.ceph:ctx.disk_config.remote_to_roles_to_dev: {Remote(name='ubuntu@vm04.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}}
2026-03-23T17:30:34.681 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/osd/ceph-0
2026-03-23T17:30:34.735 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-23T17:30:34.735 INFO:tasks.ceph:role: osd.0
2026-03-23T17:30:34.735 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_1 on ubuntu@vm04.local
2026-03-23T17:30:34.736 DEBUG:teuthology.orchestra.run.vm04:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_1
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout:meta-data=/dev/vg_nvme/lv_1 isize=2048 agcount=4, agsize=1310464 blks
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout: = reflink=1 bigtime=0 inobtcount=0
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout: = sunit=0 swidth=0 blks
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout:log =internal log bsize=4096 blocks=2560, version=2
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-23T17:30:34.785 INFO:teuthology.orchestra.run.vm04.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-23T17:30:34.789 INFO:teuthology.orchestra.run.vm04.stdout:Discarding blocks...Done.
2026-03-23T17:30:34.790 INFO:tasks.ceph:mount /dev/vg_nvme/lv_1 on ubuntu@vm04.local -o noatime
2026-03-23T17:30:34.790 DEBUG:teuthology.orchestra.run.vm04:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_1 /var/lib/ceph/osd/ceph-0
2026-03-23T17:30:34.883 DEBUG:teuthology.orchestra.run.vm04:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-0
2026-03-23T17:30:34.889 INFO:teuthology.orchestra.run.vm04.stderr:sudo: /sbin/restorecon: command not found
2026-03-23T17:30:34.889 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T17:30:34.889 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/osd/ceph-1
2026-03-23T17:30:34.939 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-23T17:30:34.939 INFO:tasks.ceph:role: osd.1
2026-03-23T17:30:34.940 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_2 on ubuntu@vm04.local
2026-03-23T17:30:34.940 DEBUG:teuthology.orchestra.run.vm04:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_2
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout:meta-data=/dev/vg_nvme/lv_2 isize=2048 agcount=4, agsize=1310464 blks
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout: = reflink=1 bigtime=0 inobtcount=0
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout: = sunit=0 swidth=0 blks
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout:log =internal log bsize=4096 blocks=2560, version=2
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-23T17:30:34.988 INFO:teuthology.orchestra.run.vm04.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-23T17:30:34.994 INFO:teuthology.orchestra.run.vm04.stdout:Discarding blocks...Done.
2026-03-23T17:30:34.995 INFO:tasks.ceph:mount /dev/vg_nvme/lv_2 on ubuntu@vm04.local -o noatime
2026-03-23T17:30:34.995 DEBUG:teuthology.orchestra.run.vm04:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_2 /var/lib/ceph/osd/ceph-1
2026-03-23T17:30:35.049 DEBUG:teuthology.orchestra.run.vm04:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-1
2026-03-23T17:30:35.101 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T17:30:35.102 INFO:teuthology.orchestra.run.vm04.stderr:sudo: /sbin/restorecon: command not found
2026-03-23T17:30:35.102 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/osd/ceph-2
2026-03-23T17:30:35.156 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-23T17:30:35.156 INFO:tasks.ceph:role: osd.2
2026-03-23T17:30:35.156 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_3 on ubuntu@vm04.local
2026-03-23T17:30:35.156 DEBUG:teuthology.orchestra.run.vm04:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_3
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout:meta-data=/dev/vg_nvme/lv_3 isize=2048 agcount=4, agsize=1310464 blks
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout: = reflink=1 bigtime=0 inobtcount=0
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout: = sunit=0 swidth=0 blks
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-23T17:30:35.208 INFO:teuthology.orchestra.run.vm04.stdout:log =internal log bsize=4096 blocks=2560, version=2
2026-03-23T17:30:35.209 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-23T17:30:35.209 INFO:teuthology.orchestra.run.vm04.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-23T17:30:35.228 INFO:teuthology.orchestra.run.vm04.stdout:Discarding blocks...Done.
2026-03-23T17:30:35.230 INFO:tasks.ceph:mount /dev/vg_nvme/lv_3 on ubuntu@vm04.local -o noatime
2026-03-23T17:30:35.230 DEBUG:teuthology.orchestra.run.vm04:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_3 /var/lib/ceph/osd/ceph-2
2026-03-23T17:30:35.286 DEBUG:teuthology.orchestra.run.vm04:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-2
2026-03-23T17:30:35.337 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T17:30:35.337 INFO:teuthology.orchestra.run.vm04.stderr:sudo: /sbin/restorecon: command not found
2026-03-23T17:30:35.338 DEBUG:teuthology.orchestra.run.vm04:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 0 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-23T17:30:35.404 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.398+0000 7f4b33860a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:35.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.398+0000 7f4b33860a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:35.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.402+0000 7f4b33860a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:35.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.402+0000 7f4b33860a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-0/keyring: can't open /var/lib/ceph/osd/ceph-0/keyring: (2) No such file or directory
2026-03-23T17:30:35.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.406+0000 7f4b33860a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-0/keyring
2026-03-23T17:30:35.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.406+0000 7f4b33860a40 -1 bdev(0x559974f0b800 /var/lib/ceph/osd/ceph-0/block) open stat got: (1) Operation not permitted
2026-03-23T17:30:35.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:35.406+0000 7f4b33860a40 -1 bluestore(/var/lib/ceph/osd/ceph-0) _read_fsid unparsable uuid
2026-03-23T17:30:36.389 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-23T17:30:36.444 DEBUG:teuthology.orchestra.run.vm04:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 1 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-23T17:30:36.514 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.510+0000 7f4afd692a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:36.514 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.510+0000 7f4afd692a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:36.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.510+0000 7f4afd692a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:36.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.514+0000 7f4afd692a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-1/keyring: can't open /var/lib/ceph/osd/ceph-1/keyring: (2) No such file or directory
2026-03-23T17:30:36.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.514+0000 7f4afd692a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-1/keyring
2026-03-23T17:30:36.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.514+0000 7f4afd692a40 -1 bdev(0x5630957b7800 /var/lib/ceph/osd/ceph-1/block) open stat got: (1) Operation not permitted
2026-03-23T17:30:36.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:36.514+0000 7f4afd692a40 -1 bluestore(/var/lib/ceph/osd/ceph-1) _read_fsid unparsable uuid
2026-03-23T17:30:37.498 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-23T17:30:37.551 DEBUG:teuthology.orchestra.run.vm04:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 2 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-23T17:30:37.616 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:37.616 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:37.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:37.618 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-2/keyring: can't open /var/lib/ceph/osd/ceph-2/keyring: (2) No such file or directory
2026-03-23T17:30:37.618 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-2/keyring
2026-03-23T17:30:37.618 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 bdev(0x559454b43800 /var/lib/ceph/osd/ceph-2/block) open stat got: (1) Operation not permitted
2026-03-23T17:30:37.618 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-23T17:30:37.614+0000 7f6cb6911a40 -1 bluestore(/var/lib/ceph/osd/ceph-2) _read_fsid unparsable uuid
2026-03-23T17:30:38.617 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-23T17:30:38.669 INFO:tasks.ceph:Reading keys from all nodes...
2026-03-23T17:30:38.669 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:38.669 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/mgr/ceph-x/keyring of=/dev/stdout
2026-03-23T17:30:38.721 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:38.721 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-0/keyring of=/dev/stdout
2026-03-23T17:30:38.775 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:38.775 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-1/keyring of=/dev/stdout
2026-03-23T17:30:38.825 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:38.825 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-2/keyring of=/dev/stdout
2026-03-23T17:30:38.877 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:38.877 DEBUG:teuthology.orchestra.run.vm04:> dd if=/etc/ceph/ceph.client.0.keyring of=/dev/stdout
2026-03-23T17:30:38.924 INFO:tasks.ceph:Adding keys to all mons...
2026-03-23T17:30:38.925 DEBUG:teuthology.orchestra.run.vm04:> sudo tee -a /etc/ceph/ceph.keyring
2026-03-23T17:30:38.975 INFO:teuthology.orchestra.run.vm04.stdout:[mgr.x]
2026-03-23T17:30:38.975 INFO:teuthology.orchestra.run.vm04.stdout: key = AQC6eMFpv+ZoJBAA8THtV7Hn/QZx46wtDNJAoQ==
2026-03-23T17:30:38.975 INFO:teuthology.orchestra.run.vm04.stdout:[osd.0]
2026-03-23T17:30:38.975 INFO:teuthology.orchestra.run.vm04.stdout: key = AQC7eMFpVi84GBAASIifoibzvoB2gavF+C7uaw==
2026-03-23T17:30:38.976 INFO:teuthology.orchestra.run.vm04.stdout:[osd.1]
2026-03-23T17:30:38.976 INFO:teuthology.orchestra.run.vm04.stdout: key = AQC8eMFpDAXQHhAAUL9e7azCYD543uTBEbKm5Q==
2026-03-23T17:30:38.976 INFO:teuthology.orchestra.run.vm04.stdout:[osd.2]
2026-03-23T17:30:38.976 INFO:teuthology.orchestra.run.vm04.stdout: key = AQC9eMFppW/iJBAA+xwoeOOxv/keknyWHtEopQ==
2026-03-23T17:30:38.976 INFO:teuthology.orchestra.run.vm04.stdout:[client.0]
2026-03-23T17:30:38.976 INFO:teuthology.orchestra.run.vm04.stdout: key = AQC6eMFpQ/AvKBAAV9vBxoNuTUR6lm0Kf71P8w==
2026-03-23T17:30:38.976 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=mgr.x --cap mon 'allow profile mgr' --cap osd 'allow *' --cap mds 'allow *'
2026-03-23T17:30:39.041 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.0 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-23T17:30:39.105 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.1 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-23T17:30:39.176 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.2 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-23T17:30:39.249 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=client.0 --cap mon 'allow rw' --cap mgr 'allow r' --cap osd 'allow rwx' --cap mds allow
2026-03-23T17:30:39.318 INFO:tasks.ceph:Running mkfs on mon nodes...
2026-03-23T17:30:39.319 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/mon/ceph-a
2026-03-23T17:30:39.373 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i a --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring
2026-03-23T17:30:39.473 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-a
2026-03-23T17:30:39.481 DEBUG:teuthology.orchestra.run.vm04:> rm -- /home/ubuntu/cephtest/ceph.monmap
2026-03-23T17:30:39.527 INFO:tasks.ceph:Starting mon daemons in cluster ceph...
2026-03-23T17:30:39.527 INFO:tasks.ceph.mon.a:Restarting daemon
2026-03-23T17:30:39.527 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i a
2026-03-23T17:30:39.569 INFO:tasks.ceph.mon.a:Started
2026-03-23T17:30:39.569 INFO:tasks.ceph:Starting mgr daemons in cluster ceph...
2026-03-23T17:30:39.569 INFO:tasks.ceph.mgr.x:Restarting daemon
2026-03-23T17:30:39.569 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mgr -f --cluster ceph -i x
2026-03-23T17:30:39.570 INFO:tasks.ceph.mgr.x:Started
2026-03-23T17:30:39.570 DEBUG:tasks.ceph:set 0 configs
2026-03-23T17:30:39.570 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph config dump
2026-03-23T17:30:39.705 INFO:teuthology.orchestra.run.vm04.stdout:WHO MASK LEVEL OPTION VALUE RO
2026-03-23T17:30:39.718 INFO:tasks.ceph:Setting crush tunables to default
2026-03-23T17:30:39.718 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd crush tunables default
2026-03-23T17:30:39.868 INFO:teuthology.orchestra.run.vm04.stderr:adjusted tunables profile to default
2026-03-23T17:30:39.883 INFO:tasks.ceph:check_enable_crimson: False
2026-03-23T17:30:39.883 INFO:tasks.ceph:Starting osd daemons in cluster ceph...
2026-03-23T17:30:39.884 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:39.884 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-0/fsid of=/dev/stdout
2026-03-23T17:30:39.893 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:39.893 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-1/fsid of=/dev/stdout
2026-03-23T17:30:39.946 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:30:39.946 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-2/fsid of=/dev/stdout
2026-03-23T17:30:39.997 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd new d91b1ac9-820d-4f5b-83d6-7a015ca26296 0
2026-03-23T17:30:40.170 INFO:teuthology.orchestra.run.vm04.stdout:0
2026-03-23T17:30:40.185 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd new b630c322-76af-4a70-9b41-7c22a53ab1d6 1
2026-03-23T17:30:40.307 INFO:teuthology.orchestra.run.vm04.stdout:1
2026-03-23T17:30:40.324 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd new d8611821-f18a-4353-a7f8-27b1691155ff 2
2026-03-23T17:30:40.458 INFO:teuthology.orchestra.run.vm04.stdout:2
2026-03-23T17:30:40.473 INFO:tasks.ceph.osd.0:Restarting daemon
2026-03-23T17:30:40.473 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0
2026-03-23T17:30:40.474 INFO:tasks.ceph.osd.0:Started
2026-03-23T17:30:40.474 INFO:tasks.ceph.osd.1:Restarting daemon
2026-03-23T17:30:40.474 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 1
2026-03-23T17:30:40.475 INFO:tasks.ceph.osd.1:Started
2026-03-23T17:30:40.475 INFO:tasks.ceph.osd.2:Restarting daemon
2026-03-23T17:30:40.475 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 2
2026-03-23T17:30:40.476 INFO:tasks.ceph.osd.2:Started
2026-03-23T17:30:40.477 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json
2026-03-23T17:30:40.524 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:40.510+0000 7f2fe46c5a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:40.525 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:40.518+0000 7f803e0a9a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:40.528 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:40.522+0000 7f9fac875a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:40.529 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:40.526+0000 7f2fe46c5a40 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-23T17:30:40.529 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:40.526+0000 7f2fe46c5a40 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-23T17:30:40.532 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:40.530+0000 7f803e0a9a40 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-23T17:30:40.533 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:40.530+0000 7f803e0a9a40 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-23T17:30:40.537 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:40.534+0000 7f9fac875a40 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-23T17:30:40.538 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:40.534+0000 7f9fac875a40 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-23T17:30:40.627 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:40.627 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":5,"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","created":"2026-03-23T17:30:39.647861+0000","modified":"2026-03-23T17:30:40.456871+0000","last_up_change":"0.000000","last_in_change":"2026-03-23T17:30:40.456871+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"d91b1ac9-820d-4f5b-83d6-7a015ca26296","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]}
,"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"b630c322-76af-4a70-9b41-7c22a53ab1d6","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"d8611821-f18a-4353-a7f8-27b1691155ff","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 
0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-23T17:30:40.640 INFO:tasks.ceph.ceph_manager.ceph:[]
2026-03-23T17:30:40.640 INFO:tasks.ceph:Waiting for OSDs to come up
2026-03-23T17:30:40.837 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:40.834+0000 7f803e0a9a40 -1 Falling back to public interface
2026-03-23T17:30:40.837 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:40.834+0000 7f2fe46c5a40 -1 Falling back to public interface
2026-03-23T17:30:40.860 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:40.858+0000 7f9fac875a40 -1 Falling back to public interface
2026-03-23T17:30:40.942 DEBUG:teuthology.orchestra.run.vm04:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-23T17:30:41.077 INFO:tasks.ceph.mgr.x.vm04.stderr:/usr/lib/python3/dist-packages/scipy/__init__.py:67: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters.
This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. 2026-03-23T17:30:41.077 INFO:tasks.ceph.mgr.x.vm04.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 2026-03-23T17:30:41.077 INFO:tasks.ceph.mgr.x.vm04.stderr: from numpy import show_config as show_numpy_config 2026-03-23T17:30:41.212 INFO:teuthology.misc.health.vm04.stderr:2026-03-23T17:30:41.206+0000 7f7baa32a640 0 --2- 192.168.123.104:0/2999479627 >> v2:192.168.123.104:3300/0 conn(0x7f7ba405a570 0x7f7ba405a940 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-23T17:30:41.333 INFO:teuthology.misc.health.vm04.stdout: 2026-03-23T17:30:41.333 INFO:teuthology.misc.health.vm04.stdout:{"epoch":5,"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","created":"2026-03-23T17:30:39.647861+0000","modified":"2026-03-23T17:30:40.456871+0000","last_up_change":"0.000000","last_in_change":"2026-03-23T17:30:40.456871+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"d91b1ac9-820d-4f5b-83d6-7a015ca26296","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_a
ddrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"b630c322-76af-4a70-9b41-7c22a53ab1d6","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"d8611821-f18a-4353-a7f8-27b1691155ff","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 
0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-23T17:30:41.348 DEBUG:teuthology.misc:0 of 3 OSDs are up
2026-03-23T17:30:41.660 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:41.654+0000 7f2fe46c5a40 -1 osd.0 0 log_to_monitors true
2026-03-23T17:30:41.661 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:41.658+0000 7f9fac875a40 -1 osd.2 0 log_to_monitors true
2026-03-23T17:30:41.668 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:41.662+0000 7f803e0a9a40 -1 osd.1 0 log_to_monitors true
2026-03-23T17:30:41.857 INFO:tasks.ceph.mgr.x.vm04.stderr:Failed to import NVMeoFClient and related components: cannot import name 'NVMeoFClient' from 'dashboard.services.nvmeof_client' (/usr/share/ceph/mgr/dashboard/services/nvmeof_client.py)
2026-03-23T17:30:42.852 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:42.850+0000 7f2fe066e640 -1 osd.0 0 waiting for initial osdmap
2026-03-23T17:30:42.852 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:42.850+0000 7f803a052640 -1 osd.1 0 waiting for initial osdmap
2026-03-23T17:30:42.855 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-23T17:30:42.850+0000 7f8034e60640 -1 osd.1 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-23T17:30:42.856 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-23T17:30:42.850+0000 7f2fdb47c640 -1 osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-23T17:30:42.860 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:42.858+0000 7f9fa881e640 -1 osd.2 0 waiting for initial osdmap
2026-03-23T17:30:42.863 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-23T17:30:42.858+0000 7f9fa362c640 -1 osd.2 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-23T17:30:43.466 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T17:30:43.462+0000 7f9fcd435640 -1 mgr.server handle_report got status from non-daemon mon.a
2026-03-23T17:30:47.651 DEBUG:teuthology.orchestra.run.vm04:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-23T17:30:47.837 INFO:teuthology.misc.health.vm04.stdout:
2026-03-23T17:30:47.838
INFO:teuthology.misc.health.vm04.stdout:{"epoch":11,"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","created":"2026-03-23T17:30:39.647861+0000","modified":"2026-03-23T17:30:47.474729+0000","last_up_change":"2026-03-23T17:30:43.838380+0000","last_in_change":"2026-03-23T17:30:40.456871+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-23T17:30:44.476796+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"no
ne"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"d91b1ac9-820d-4f5b-83d6-7a015ca26296","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6804","nonce":2398517092}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6808","nonce":2398517092}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6815","nonce":2398517092}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6812","nonce":2398517092}]},"public_addr":"192.168.123.104:6804/2398517092","cluster_addr":"192.168.123.104:6808/2398517092","heartbeat_back_addr":"192.168.123.104:6815/2398517092","heartbeat_front_addr":"192.168.123.104:6812/2398517092","state":["exists","up"]},{"osd":1,"uuid":"b630c322-76af-4a70-9b41-7c22a53ab1d6","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":9,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":372
8786032},{"type":"v1","addr":"192.168.123.104:6801","nonce":3728786032}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6803","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6805","nonce":3728786032}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6811","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6813","nonce":3728786032}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6807","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6809","nonce":3728786032}]},"public_addr":"192.168.123.104:6801/3728786032","cluster_addr":"192.168.123.104:6805/3728786032","heartbeat_back_addr":"192.168.123.104:6813/3728786032","heartbeat_front_addr":"192.168.123.104:6809/3728786032","state":["exists","up"]},{"osd":2,"uuid":"d8611821-f18a-4353-a7f8-27b1691155ff","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6817","nonce":2446644018}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6819","nonce":2446644018}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6823","nonce":2446644018}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6821","nonce":2446644018}]},"public_addr":"192.168.123.104:6817/2446644018","cluster_addr":"192.168.123.104:6819/2446644018","heartbeat_back_addr":"192.168.123.104:6823/2446644018","heartbeat_front_addr":"192.168.123.104:6821/2446644018","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged
_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-23T17:30:47.853 DEBUG:teuthology.misc:3 of 3 OSDs are up
2026-03-23T17:30:47.853 INFO:tasks.ceph:Creating RBD pool
2026-03-23T17:30:47.853 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd pool create rbd 8
2026-03-23T17:30:48.488 INFO:teuthology.orchestra.run.vm04.stderr:pool 'rbd' created
2026-03-23T17:30:48.511 DEBUG:teuthology.orchestra.run.vm04:> rbd --cluster ceph pool init rbd
2026-03-23T17:30:51.700 INFO:tasks.ceph:Starting mds daemons in cluster ceph...
2026-03-23T17:30:51.700 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph config log 1 --format=json 2026-03-23T17:30:51.700 INFO:tasks.daemonwatchdog.daemon_watchdog:watchdog starting 2026-03-23T17:30:51.892 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:51.907 INFO:teuthology.orchestra.run.vm04.stdout:[{"version":1,"timestamp":"0.000000","name":"","changes":[]}] 2026-03-23T17:30:51.907 INFO:tasks.ceph_manager:config epoch is 1 2026-03-23T17:30:51.907 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-23T17:30:51.907 INFO:tasks.ceph.ceph_manager.ceph:waiting for mgr available 2026-03-23T17:30:51.907 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph mgr dump --format=json 2026-03-23T17:30:52.113 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:52.130 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":5,"flags":0,"active_gid":4104,"active_name":"x","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":491281152},{"type":"v1","addr":"192.168.123.104:6825","nonce":491281152}]},"active_addr":"192.168.123.104:6825/491281152","active_change":"2026-03-23T17:30:42.454991+0000","active_mgr_features":4544132024016699391,"available":true,"standbys":[],"modules":["iostat","nfs"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health 
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to, use commas to separate multiple","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","read","upmap","upmap-read"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"update_pg_upmap_activity":{"name":"update_pg_upmap_activity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Updates pg_upmap activity stats to be used in `balancer status detail`","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent 
ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cephadm_log_destination":{"name":"cephadm_log_destination","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":["file","file,syslog","syslog"],"desc":"Destination for cephadm command's persistent logging","long_desc":"","tags":[],"see_also":[]},"certificate_automated_rotation_enabled":{"name":"certificate_automated_rotation_enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"This flag controls whether cephadm automatically rotates certificates upon expiration.","long_desc":"","tags":[],"see_also":[]},"certificate_check_debug_mode":{"name":"certificate_check_debug_mode","type":"bool","level":"dev","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"FOR TESTING ONLY: This flag forces the certificate check instead of waiting for 
certificate_check_period.","long_desc":"","tags":[],"see_also":[]},"certificate_check_period":{"name":"certificate_check_period","type":"int","level":"advanced","flags":0,"default_value":"1","min":"0","max":"30","enum_allowed":[],"desc":"Specifies how often (in days) the certificate should be checked for validity.","long_desc":"","tags":[],"see_also":[]},"certificate_duration_days":{"name":"certificate_duration_days","type":"int","level":"advanced","flags":0,"default_value":"1095","min":"90","max":"3650","enum_allowed":[],"desc":"Specifies the duration of self certificates generated and signed by cephadm root CA","long_desc":"","tags":[],"see_also":[]},"certificate_renewal_threshold_days":{"name":"certificate_renewal_threshold_days","type":"int","level":"advanced","flags":0,"default_value":"30","min":"10","max":"90","enum_allowed":[],"desc":"Specifies the lead time in days to initiate certificate renewal before expiration.","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.28.1","min":"","max":"","enum_allowed":[],"desc":"Alertmanager container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"Elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/grafana:12.3.1","min":"","max":"","enum_allowed":[],"desc":"Grafana container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"Haproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_nginx":{"name":"container_image_nginx","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nginx:sclorg-nginx-126","min":"","max":"","enum_allowed":[],"desc":"Nginx container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.9.1","min":"","max":"","enum_allowed":[],"desc":"Node exporter container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.5","min":"","max":"","enum_allowed":[],"desc":"Nvmeof container image","long_desc":"","tags":[],"see_also":[]},"container_image_oauth2_proxy":{"name":"container_image_oauth2_proxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/oauth2-proxy/oauth2-proxy:v7.6.0","min":"","max":"","enum_allowed":[],"desc":"Oauth2 proxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v3.6.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba":{"name":"container_image_samba","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-server:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba_metrics":{"name":"container_image_samba_metrics","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-metrics:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba metrics container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"Snmp gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in 
seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"grafana_dashboards_path":{"name":"grafana_dashboards_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/grafana/dashboards/ceph-dashboard/","min":"","max":"","enum_allowed":[],"desc":"location of dashboards to include in grafana deployments","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every 
host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus 
deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_count_max":{"name":"ssh_keepalive_count_max","type":"int","level":"advanced","flags":0,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"How many times ssh connections can fail liveness checks before the host is marked offline","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_interval":{"name":"ssh_keepalive_interval","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"How often ssh connections are checked for liveness","long_desc":"","tags":[],"see_also":[]},"stray_daemon_check_interval":{"name":"stray_daemon_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"how frequently cephadm should check for the presence of stray 
daemons","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MANAGED_BY_CLUSTERS":{"name":"MANAGED_BY_CLUSTERS","type":"str","level":"advanced","flags":0,"default_value":"[]","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MULTICLUSTER_CONFIG":{"name":"MULTICLUSTER_CONFIG","type":"str","level":"advanced","flags":0,"default_value":"{}","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROM_ALERT_CREDENTIAL_CACHE_TTL":{"name":"PROM_ALERT_CREDENTIAL_CACHE_TTL","type":"int","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_PO
LICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advan
ced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_HOSTNAME_PER_DAEMON":{"name":"RGW_HOSTNAME_PER_DAEMON","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD
_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crypto_caller":{"name":"crypto_caller","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sso_oauth2":{"name":"sso_oauth2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}
,{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health 
metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not 
found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level
","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level"
:"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":""
,"enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The 
factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","
max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_al
lowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"
","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default
_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":
"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, 
version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced
","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge 
threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"squid":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"tentacle":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":0,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":41251
80593}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":174563998}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":2400433533}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":2971116904}]}]} 2026-03-23T17:30:52.131 INFO:tasks.ceph.ceph_manager.ceph:mgr available! 2026-03-23T17:30:52.131 INFO:tasks.ceph.ceph_manager.ceph:waiting for all up 2026-03-23T17:30:52.131 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-23T17:30:52.310 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:52.310 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":15,"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","created":"2026-03-23T17:30:39.647861+0000","modified":"2026-03-23T17:30:51.682097+0000","last_up_change":"2026-03-23T17:30:43.838380+0000","last_in_change":"2026-03-23T17:30:40.456871+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-23T17:30:44.476796+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid
":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-23T17:30:48.033244+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"d91b1ac9-820d-4f5b-83d6-7a015ca26296","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6804","nonce":2398517092}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6808","nonce":2398517092}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6815","nonce":2398517092}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6812","nonce":2398517092}]},"public_addr":"192.168.123.104:6804/2398517092","cluster_addr":"192.168.123.104:6808/2398517092","heartbeat_back_addr":"192.168.123.104:6815/2398517092","heartbeat_front_addr":"192.168.123.104:6812/2398517092","state":["exists","up"]},{"osd":1,"uuid":"b630c322-76af-4a70-9b41-7c22a53ab1d6","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6801","nonce":3728786032}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6803","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6805","nonce":3728786032}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6811","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6813","nonce":372
8786032}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6807","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6809","nonce":3728786032}]},"public_addr":"192.168.123.104:6801/3728786032","cluster_addr":"192.168.123.104:6805/3728786032","heartbeat_back_addr":"192.168.123.104:6813/3728786032","heartbeat_front_addr":"192.168.123.104:6809/3728786032","state":["exists","up"]},{"osd":2,"uuid":"d8611821-f18a-4353-a7f8-27b1691155ff","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6817","nonce":2446644018}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6819","nonce":2446644018}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6823","nonce":2446644018}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6821","nonce":2446644018}]},"public_addr":"192.168.123.104:6817/2446644018","cluster_addr":"192.168.123.104:6819/2446644018","heartbeat_back_addr":"192.168.123.104:6823/2446644018","heartbeat_front_addr":"192.168.123.104:6821/2446644018","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoc
h":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-23T17:30:52.326 INFO:tasks.ceph.ceph_manager.ceph:all up! 2026-03-23T17:30:52.326 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-23T17:30:52.527 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:52.527 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":15,"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","created":"2026-03-23T17:30:39.647861+0000","modified":"2026-03-23T17:30:51.682097+0000","last_up_change":"2026-03-23T17:30:43.838380+0000","last_in_change":"2026-03-23T17:30:40.456871+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-23T17:30:44.476796+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_b
ucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-23T17:30:48.033244+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"d91b1ac9-820d-4f5b-83d6-7a015ca26296","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6804","nonce":2398517092}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6808","nonce":2398517092}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6815","nonce":2398517092}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6812","nonce":2398517092}]},"public_addr":"192.168.123.104:6804/2398517092","cluster_addr":"192.168.123.104:6808/2398517092","heartbeat_back_addr":"192.168.123.104:6815/2398517092","heartbeat_front_addr":"192.168.123.104:6812/2398517092","state":["exists","up"]},{"osd":1,"uuid":"b630c322-76af-4a70-9b41-7c22a53ab1d6","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6801","nonce":3728786032}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6803","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6805","nonce":3728786032}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6811","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6813","nonce":372
8786032}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6807","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6809","nonce":3728786032}]},"public_addr":"192.168.123.104:6801/3728786032","cluster_addr":"192.168.123.104:6805/3728786032","heartbeat_back_addr":"192.168.123.104:6813/3728786032","heartbeat_front_addr":"192.168.123.104:6809/3728786032","state":["exists","up"]},{"osd":2,"uuid":"d8611821-f18a-4353-a7f8-27b1691155ff","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6817","nonce":2446644018}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6819","nonce":2446644018}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6823","nonce":2446644018}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6821","nonce":2446644018}]},"public_addr":"192.168.123.104:6817/2446644018","cluster_addr":"192.168.123.104:6819/2446644018","heartbeat_back_addr":"192.168.123.104:6823/2446644018","heartbeat_front_addr":"192.168.123.104:6821/2446644018","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoc
h":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-23T17:30:52.544 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 flush_pg_stats 2026-03-23T17:30:52.545 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 flush_pg_stats 2026-03-23T17:30:52.545 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 flush_pg_stats 2026-03-23T17:30:52.655 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-23T17:30:52.655 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-23T17:30:52.664 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-23T17:30:52.664 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-23T17:30:52.679 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-23T17:30:52.679 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 
2026-03-23T17:30:52.852 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-23T17:30:52.863 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-23T17:30:52.876 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.2 2026-03-23T17:30:52.899 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.0 2026-03-23T17:30:52.926 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-23T17:30:52.940 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.1 2026-03-23T17:30:53.877 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-23T17:30:53.900 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-23T17:30:53.941 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-23T17:30:54.075 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-23T17:30:54.089 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.2 2026-03-23T17:30:54.117 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-23T17:30:54.132 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.0 2026-03-23T17:30:54.143 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-23T17:30:54.157 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.1 2026-03-23T17:30:55.090 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-23T17:30:55.137 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage 
timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-23T17:30:55.158 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-23T17:30:55.268 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-23T17:30:55.285 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.2 2026-03-23T17:30:55.285 DEBUG:teuthology.parallel:result is None 2026-03-23T17:30:55.319 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-23T17:30:55.335 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.0 2026-03-23T17:30:55.335 DEBUG:teuthology.parallel:result is None 2026-03-23T17:30:55.362 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-23T17:30:55.380 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.1 2026-03-23T17:30:55.380 DEBUG:teuthology.parallel:result is None 2026-03-23T17:30:55.380 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-23T17:30:55.380 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-23T17:30:55.598 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:55.599 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-23T17:30:55.614 
INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":17,"stamp":"2026-03-23T17:30:54.464414+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81408,"kb_used_data":856,"kb_used_omap":22,"kb_used_meta":80425,"kb_avail":283034112,"statfs":{"total":289910292480,"available":289826930688,"internally_reserved":0,"allocated":876544,"data_stored":1029399,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":23063,"internal_metadata":82355689},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_sta
t":{"commit_latency_ms":8,"apply_latency_ms":8,"commit_latency_ns":8000000,"apply_latency_ns":8000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"4.972336"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694140+0000","last_change":"2026-03-23T17:30:51.694301+0000","last_active":"2026-03-23T17:30:51.694140+0000","last_peered":"2026-03-23T17:30:51.694140+0000","last_clean":"2026-03-23T17:30:51.694140+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T17:30:51.694140+0000","last_undegraded":"2026-03-23T17:30:51.694140+0000","last_fullsized":"2026-03-23T17:30:51.694140+0000","mapping_epoch":12,"log_start":"0'0","on
disk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T18:00:58.180879+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694159+0000","last_change":"2026-03-23T17:30:51.6
94315+0000","last_active":"2026-03-23T17:30:51.694159+0000","last_peered":"2026-03-23T17:30:51.694159+0000","last_clean":"2026-03-23T17:30:51.694159+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T17:30:51.694159+0000","last_undegraded":"2026-03-23T17:30:51.694159+0000","last_fullsized":"2026-03-23T17:30:51.694159+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T23:06:50.118782+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694334+0000","last_change":"2026-03-23T17:30:51.694436+0000","last_active":"2026-03-23T17:30:51.694334+0000","last_peered":"2026-03-23T17:30:51.694334+0000","last_clean":"2026-03-23T17:30:51.694334+0000","last_became_active":"2026-03-23T17:30:49.500052+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T17:30:51.694334+0000","last_undegraded":"2026-03-23T17:30:51.694334+0000","last_fullsized":"2026-03-23T17:30:51.694334+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T19:02:09.818760+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694460+0000","last_change":"2026-03-23T17:30:51.694557+0000","last_active":"2026-03-23T17:30:51.694460+0000","last_peered":"2026-03-23T17:30:51.694460+0000","last_clean":"2026-03-23T17:30:51.694460+0000","last_became_active":"2026-03-
23T17:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T17:30:51.694460+0000","last_undegraded":"2026-03-23T17:30:51.694460+0000","last_fullsized":"2026-03-23T17:30:51.694460+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:57:46.539497+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694479+0000","last_change":"2026-03-23T17:30:51.694479+0000","last_active":"2026-03-23T17:30:51.694479+0000","last_peered":"2026-03-23T17:30:51.694479+0000","last_clean":"2026-03-23T17:30:51.694479+0000","last_became_active":"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T17:30:51.694479+0000","last_undegraded":"2026-03-23T17:30:51.694479+0000","last_fullsized":"2026-03-23T17:30:51.694479+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T02:34:59.316767+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.712424+0000","last_change":"2026-03-23T17:30:51.712676+0000","last_active":"2026-03-23T17:30:51.712424+0000","last_peered":"2026-03-23T17:30:51.712424+0000","last_clean":"2026-03-23T17:30:51.712424+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T17:30:51.712424+0000","last_undegraded":"2026-03-23T17:30:51.712424+0000","last_fullsized":"2026-03-23T17:30:51.712424+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub
_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:06:03.323628+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.712617+0000","last_change":"2026-03-23T17:30:51.712764+0000","last_active":"2026-03-23T17:30:51.712617+0000","last_peered":"2026-03-23T17:30:51.712617+0000","last_clean":"2026-03-23T17:30:51.712617+0000","last_became_active":"2026-03-23T17:30:4
9.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T17:30:51.712617+0000","last_undegraded":"2026-03-23T17:30:51.712617+0000","last_fullsized":"2026-03-23T17:30:51.712617+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T17:43:01.349684+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts"
:[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694484+0000","last_change":"2026-03-23T17:30:51.694574+0000","last_active":"2026-03-23T17:30:51.694484+0000","last_peered":"2026-03-23T17:30:51.694484+0000","last_clean":"2026-03-23T17:30:51.694484+0000","last_became_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T17:30:51.694484+0000","last_undegraded":"2026-03-23T17:30:51.694484+0000","last_fullsized":"2026-03-23T17:30:51.694484+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T00:28:04.686913+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":64,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694074+0000","last_change":"2026-03-23T17:30:46.635114+0000","last_active":"2026-03-23T17:30:51.694074+0000","last_peered":"2026-03-23T17:30:51.694074+0000","last_clean":"2026-03-23T17:30:51.694074+0000","last_became_active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T17:30:51.694074+0000","last_undegraded":"2026-03-23T17:30:51.694074+0000","last_fullsized":"2026-03-23T17:30:51.694074+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_clean_scrub_stamp":"2026-03-23T17:30:45.468031+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T03:27:15.872490+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_s
crub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondi
sk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26976,"kb_used_data":120,"kb_used_omap":6,"kb_used_meta":26809,"kb_avail":94344864,"statfs":{"total":96636764160,"available":96609140736,"internally_reserved":0,"allocated":122880,"data_stored":34563,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":6821,"internal_metadata":27452763},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27216,"kb_used_data":368,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344624,"statfs":{"total":96636764160,"available":96608894976,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8121,"internal_metadata":27451463},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27216,"kb_used_data":368,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344624,"statfs":{"total":96636764160,"available":96608894976,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8121
,"internal_metadata":27451463},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-23T17:30:55.615 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-23T17:30:55.787 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T17:30:55.787 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-23T17:30:55.802 
INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":17,"stamp":"2026-03-23T17:30:54.464414+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81408,"kb_used_data":856,"kb_used_omap":22,"kb_used_meta":80425,"kb_avail":283034112,"statfs":{"total":289910292480,"available":289826930688,"internally_reserved":0,"allocated":876544,"data_stored":1029399,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":23063,"internal_metadata":82355689},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_sta
t":{"commit_latency_ms":8,"apply_latency_ms":8,"commit_latency_ns":8000000,"apply_latency_ns":8000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"4.972336"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694140+0000","last_change":"2026-03-23T17:30:51.694301+0000","last_active":"2026-03-23T17:30:51.694140+0000","last_peered":"2026-03-23T17:30:51.694140+0000","last_clean":"2026-03-23T17:30:51.694140+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T17:30:51.694140+0000","last_undegraded":"2026-03-23T17:30:51.694140+0000","last_fullsized":"2026-03-23T17:30:51.694140+0000","mapping_epoch":12,"log_start":"0'0","on
disk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T18:00:58.180879+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694159+0000","last_change":"2026-03-23T17:30:51.6
94315+0000","last_active":"2026-03-23T17:30:51.694159+0000","last_peered":"2026-03-23T17:30:51.694159+0000","last_clean":"2026-03-23T17:30:51.694159+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T17:30:51.694159+0000","last_undegraded":"2026-03-23T17:30:51.694159+0000","last_fullsized":"2026-03-23T17:30:51.694159+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T23:06:50.118782+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694334+0000","last_change":"2026-03-23T17:30:51.694436+0000","last_active":"2026-03-23T17:30:51.694334+0000","last_peered":"2026-03-23T17:30:51.694334+0000","last_clean":"2026-03-23T17:30:51.694334+0000","last_became_active":"2026-03-23T17:30:49.500052+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T17:30:51.694334+0000","last_undegraded":"2026-03-23T17:30:51.694334+0000","last_fullsized":"2026-03-23T17:30:51.694334+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T19:02:09.818760+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694460+0000","last_change":"2026-03-23T17:30:51.694557+0000","last_active":"2026-03-23T17:30:51.694460+0000","last_peered":"2026-03-23T17:30:51.694460+0000","last_clean":"2026-03-23T17:30:51.694460+0000","last_became_active":"2026-03-
23T17:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T17:30:51.694460+0000","last_undegraded":"2026-03-23T17:30:51.694460+0000","last_fullsized":"2026-03-23T17:30:51.694460+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:57:46.539497+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694479+0000","last_change":"2026-03-23T17:30:51.694479+0000","last_active":"2026-03-23T17:30:51.694479+0000","last_peered":"2026-03-23T17:30:51.694479+0000","last_clean":"2026-03-23T17:30:51.694479+0000","last_became_active":"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T17:30:51.694479+0000","last_undegraded":"2026-03-23T17:30:51.694479+0000","last_fullsized":"2026-03-23T17:30:51.694479+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T02:34:59.316767+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.712424+0000","last_change":"2026-03-23T17:30:51.712676+0000","last_active":"2026-03-23T17:30:51.712424+0000","last_peered":"2026-03-23T17:30:51.712424+0000","last_clean":"2026-03-23T17:30:51.712424+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T17:30:51.712424+0000","last_undegraded":"2026-03-23T17:30:51.712424+0000","last_fullsized":"2026-03-23T17:30:51.712424+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub
_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:06:03.323628+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.712617+0000","last_change":"2026-03-23T17:30:51.712764+0000","last_active":"2026-03-23T17:30:51.712617+0000","last_peered":"2026-03-23T17:30:51.712617+0000","last_clean":"2026-03-23T17:30:51.712617+0000","last_became_active":"2026-03-23T17:30:4
9.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T17:30:51.712617+0000","last_undegraded":"2026-03-23T17:30:51.712617+0000","last_fullsized":"2026-03-23T17:30:51.712617+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T17:43:01.349684+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts"
:[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694484+0000","last_change":"2026-03-23T17:30:51.694574+0000","last_active":"2026-03-23T17:30:51.694484+0000","last_peered":"2026-03-23T17:30:51.694484+0000","last_clean":"2026-03-23T17:30:51.694484+0000","last_became_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T17:30:51.694484+0000","last_undegraded":"2026-03-23T17:30:51.694484+0000","last_fullsized":"2026-03-23T17:30:51.694484+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T00:28:04.686913+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":64,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-23T17:30:51.694074+0000","last_change":"2026-03-23T17:30:46.635114+0000","last_active":"2026-03-23T17:30:51.694074+0000","last_peered":"2026-03-23T17:30:51.694074+0000","last_clean":"2026-03-23T17:30:51.694074+0000","last_became_active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T17:30:51.694074+0000","last_undegraded":"2026-03-23T17:30:51.694074+0000","last_fullsized":"2026-03-23T17:30:51.694074+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_clean_scrub_stamp":"2026-03-23T17:30:45.468031+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T03:27:15.872490+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_s
crub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondi
sk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26976,"kb_used_data":120,"kb_used_omap":6,"kb_used_meta":26809,"kb_avail":94344864,"statfs":{"total":96636764160,"available":96609140736,"internally_reserved":0,"allocated":122880,"data_stored":34563,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":6821,"internal_metadata":27452763},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27216,"kb_used_data":368,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344624,"statfs":{"total":96636764160,"available":96608894976,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8121,"internal_metadata":27451463},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27216,"kb_used_data":368,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344624,"statfs":{"total":96636764160,"available":96608894976,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8121
,"internal_metadata":27451463},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}}
2026-03-23T17:30:55.802 INFO:tasks.ceph.ceph_manager.ceph:clean!
2026-03-23T17:30:55.802 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy...
2026-03-23T17:30:55.802 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy
2026-03-23T17:30:55.802 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph health --format=json
2026-03-23T17:30:55.992 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T17:30:55.992 INFO:teuthology.orchestra.run.vm04.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]}
2026-03-23T17:30:56.006 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy done
2026-03-23T17:30:56.006 INFO:teuthology.run_tasks:Running task exec...
2026-03-23T17:30:56.008 INFO:teuthology.task.exec:Executing custom commands...
2026-03-23T17:30:56.008 INFO:teuthology.task.exec:Running commands on role client.0 host ubuntu@vm04.local
2026-03-23T17:30:56.008 DEBUG:teuthology.orchestra.run.vm04:> sudo TESTDIR=/home/ubuntu/cephtest bash -c 'sudo ceph osd erasure-code-profile set teuthologyprofile crush-failure-domain=osd m=1 k=2'
2026-03-23T17:30:56.736 INFO:teuthology.orchestra.run.vm04.stderr:load: isa
2026-03-23T17:30:56.752 DEBUG:teuthology.orchestra.run.vm04:> sudo TESTDIR=/home/ubuntu/cephtest bash -c 'sudo ceph osd pool create datapool 4 4 erasure teuthologyprofile'
2026-03-23T17:30:58.751 INFO:teuthology.orchestra.run.vm04.stderr:pool 'datapool' created
2026-03-23T17:30:58.770 DEBUG:teuthology.orchestra.run.vm04:> sudo TESTDIR=/home/ubuntu/cephtest bash -c 'sudo ceph osd pool set datapool allow_ec_overwrites true'
2026-03-23T17:30:59.640 INFO:teuthology.orchestra.run.vm04.stderr:set pool 3 allow_ec_overwrites to true
2026-03-23T17:30:59.664 DEBUG:teuthology.orchestra.run.vm04:> sudo TESTDIR=/home/ubuntu/cephtest bash -c 'rbd pool init datapool'
2026-03-23T17:31:02.739 INFO:teuthology.run_tasks:Running task workunit...
2026-03-23T17:31:02.746 INFO:tasks.workunit:Pulling workunits from ref 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-23T17:31:02.746 INFO:tasks.workunit:Making a separate scratch dir for every client...
2026-03-23T17:31:02.746 DEBUG:teuthology.orchestra.run.vm04:> stat -- /home/ubuntu/cephtest/mnt.0
2026-03-23T17:31:02.750 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T17:31:02.750 INFO:teuthology.orchestra.run.vm04.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory
2026-03-23T17:31:02.750 DEBUG:teuthology.orchestra.run.vm04:> mkdir -- /home/ubuntu/cephtest/mnt.0
2026-03-23T17:31:02.795 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0
2026-03-23T17:31:02.796 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0
2026-03-23T17:31:02.839 INFO:tasks.workunit:timeout=3h
2026-03-23T17:31:02.839 INFO:tasks.workunit:cleanup=True
2026-03-23T17:31:02.839 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-23T17:31:02.884 INFO:tasks.workunit.client.0.vm04.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'...
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'.
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:state without impacting any branches by switching back to a branch.
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr: git switch -c
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:Or undo this operation with:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr: git switch -
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T17:31:42.302 INFO:tasks.workunit.client.0.vm04.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too
2026-03-23T17:31:42.309 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-23T17:31:42.355 INFO:tasks.workunit.client.0.vm04.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-23T17:31:42.356 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-23T17:31:42.356 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-23T17:31:42.397 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-23T17:31:42.427 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-23T17:31:42.452 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-23T17:31:42.452 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-23T17:31:42.453 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-23T17:31:42.477 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-23T17:31:42.480 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-23T17:31:42.480 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-23T17:31:42.527 INFO:tasks.workunit:Running workunits matching rbd/cli_generic.sh on client.0...
2026-03-23T17:31:42.528 INFO:tasks.workunit:Running workunit rbd/cli_generic.sh...
2026-03-23T17:31:42.528 DEBUG:teuthology.orchestra.run.vm04:workunit test rbd/cli_generic.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/cli_generic.sh
2026-03-23T17:31:42.575 INFO:tasks.workunit.client.0.vm04.stderr:+ export RBD_FORCE_ALLOW_V1=1
2026-03-23T17:31:42.575 INFO:tasks.workunit.client.0.vm04.stderr:+ RBD_FORCE_ALLOW_V1=1
2026-03-23T17:31:42.575 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:42.575 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:31:42.575 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v '^0$'
2026-03-23T17:31:42.598 INFO:tasks.workunit.client.0.vm04.stderr:+ IMGS='testimg1 testimg2 testimg3 testimg4 testimg5 testimg6 testimg-diff1 testimg-diff2 testimg-diff3 foo foo2 bar bar2 test1 test2 test3 test4 clone2'
2026-03-23T17:31:42.598 INFO:tasks.workunit.client.0.vm04.stderr:+ tiered=0
2026-03-23T17:31:42.598 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd dump
2026-03-23T17:31:42.598 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^pool'
2026-03-23T17:31:42.598 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ''\''rbd'\'''
2026-03-23T17:31:42.598 INFO:tasks.workunit.client.0.vm04.stderr:+ grep tier
2026-03-23T17:31:42.820 INFO:tasks.workunit.client.0.vm04.stderr:+ test_pool_image_args
2026-03-23T17:31:42.820 INFO:tasks.workunit.client.0.vm04.stdout:testing pool and image args...
2026-03-23T17:31:42.821 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing pool and image args...'
2026-03-23T17:31:42.821 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T17:31:42.821 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:42.876 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:42.929 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:42.981 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.034 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.086 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.139 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.191 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.245 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.303 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.357 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.412 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.468 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.523 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.575 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.636 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.689 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:43.798 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it
2026-03-23T17:31:44.006 INFO:tasks.workunit.client.0.vm04.stderr:pool 'test' does not exist
2026-03-23T17:31:44.019 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create test 32
2026-03-23T17:31:44.652 INFO:tasks.workunit.client.0.vm04.stderr:pool 'test' already exists
2026-03-23T17:31:44.664 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init test
2026-03-23T17:31:47.615 INFO:tasks.workunit.client.0.vm04.stderr:+ truncate -s 1 /tmp/empty /tmp/empty@snap
2026-03-23T17:31:47.617 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:47.617 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:31:47.617 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T17:31:47.646 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T17:31:47.647 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 1 test1
2026-03-23T17:31:47.677 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:47.677 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test1
2026-03-23T17:31:47.700 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --image test2 /tmp/empty
2026-03-23T17:31:47.733 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:47.733 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest
2026-03-23T17:31:47.737 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:47.737 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test2
2026-03-23T17:31:47.760 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test3 import /tmp/empty
2026-03-23T17:31:47.792 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:47.795 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:47.795 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test3
2026-03-23T17:31:47.817 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty foo
2026-03-23T17:31:47.850 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:47.854 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:47.854 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q foo
2026-03-23T17:31:47.876 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --dest test/empty@snap /tmp/empty
2026-03-23T17:31:47.890 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-23T17:31:47.891 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T17:31:47.891 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty test/empty@snap
2026-03-23T17:31:47.908 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-23T17:31:47.909 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T17:31:47.909 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --image test/empty@snap /tmp/empty
2026-03-23T17:31:47.924 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-23T17:31:47.924 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest
2026-03-23T17:31:47.926 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T17:31:47.926 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty@snap
2026-03-23T17:31:47.939 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-23T17:31:47.941 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T17:31:47.941 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:47.941 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:31:47.941 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T17:31:47.964 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T17:31:47.964 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty test/test1
2026-03-23T17:31:47.997 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.001 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.001 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test1
2026-03-23T17:31:48.024 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p test import /tmp/empty test2
2026-03-23T17:31:48.056 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.056 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-23T17:31:48.059 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.059 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test2
2026-03-23T17:31:48.081 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test3 -p test import /tmp/empty
2026-03-23T17:31:48.113 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.113 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest
2026-03-23T17:31:48.113 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-23T17:31:48.116 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.116 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test3
2026-03-23T17:31:48.138 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test4 -p test import /tmp/empty
2026-03-23T17:31:48.173 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.173 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest
2026-03-23T17:31:48.173 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-23T17:31:48.176 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.176 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test4
2026-03-23T17:31:48.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test5 -p test import /tmp/empty
2026-03-23T17:31:48.233 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.233 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-23T17:31:48.236 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.236 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test5
2026-03-23T17:31:48.259 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test6 --dest-pool test import /tmp/empty
2026-03-23T17:31:48.293 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.297 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.297 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test6
2026-03-23T17:31:48.321 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test7 --dest-pool test import /tmp/empty
2026-03-23T17:31:48.355 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.356 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest
2026-03-23T17:31:48.359 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.359 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test7
2026-03-23T17:31:48.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test/test8 import /tmp/empty
2026-03-23T17:31:48.413 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.413 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest
2026-03-23T17:31:48.416 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.416 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test8
2026-03-23T17:31:48.439 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test/test9 import /tmp/empty
2026-03-23T17:31:48.473 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.477 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.477 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test9
2026-03-23T17:31:48.500 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --pool test /tmp/empty
2026-03-23T17:31:48.532 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T17:31:48.533 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-23T17:31:48.536 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.536 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q empty
2026-03-23T17:31:48.558 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy test/test9 test10
2026-03-23T17:31:48.592 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-23T17:31:48.595 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.595 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qv test10
2026-03-23T17:31:48.620 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:48.621 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test10
2026-03-23T17:31:48.642 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy test/test9 test/test10
2026-03-23T17:31:48.676 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-23T17:31:48.679 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.680 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test10
2026-03-23T17:31:48.702 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy --pool test test10 --dest-pool test test11
2026-03-23T17:31:48.738 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-23T17:31:48.742 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.742 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test11
2026-03-23T17:31:48.765 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy --dest-pool rbd --pool test test11 test12
2026-03-23T17:31:48.802 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-23T17:31:48.805 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:48.806 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test12
2026-03-23T17:31:48.827 INFO:tasks.workunit.client.0.vm04.stdout:test12
2026-03-23T17:31:48.827 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test
2026-03-23T17:31:48.827 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qv test12
2026-03-23T17:31:48.851 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/empty /tmp/empty@snap
2026-03-23T17:31:48.852 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it
2026-03-23T17:31:49.681 INFO:tasks.workunit.client.0.vm04.stderr:pool 'test' does not exist
2026-03-23T17:31:49.693 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-23T17:31:49.693 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm foo
2026-03-23T17:31:49.752 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:49.750+0000 7fd53ab73640 0 -- 192.168.123.104:0/1511528961 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557d88849f20 msgr2=0x557d8887b6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:31:49.755 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:49.759 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-23T17:31:49.759 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T17:31:49.820 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:49.814+0000 7faa02d33640 0 -- 192.168.123.104:0/1472777636 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563cc14cff20 msgr2=0x563cc15016d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:49.821 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:49.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-23T17:31:49.825 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test10
2026-03-23T17:31:49.879 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:49.874+0000 7f9318954640 0 -- 192.168.123.104:0/1271236403 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f92f805c6e0 msgr2=0x7f92f807cae0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:49.883 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:49.886 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-23T17:31:49.886 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test12
2026-03-23T17:31:49.940 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:49.934+0000 7f38f1397640 0 -- 192.168.123.104:0/3455280081 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556eb6b6fb50 msgr2=0x556eb6ba1700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:49.942 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:49.946 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-23T17:31:49.946 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T17:31:50.004 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:50.002+0000 7f30b431f640 0 -- 192.168.123.104:0/3286708633 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a0a0193bb0 msgr2=0x55a0a014b300 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:50.006 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:50.010 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-23T17:31:50.010 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test3
2026-03-23T17:31:50.066 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:50.062+0000 7f9225fc4640 0 -- 192.168.123.104:0/3686062763 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558c08b80f20 msgr2=0x558c08bb2640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:50.067 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:50.070 INFO:tasks.workunit.client.0.vm04.stderr:+ test_rename
2026-03-23T17:31:50.070 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing rename...'
2026-03-23T17:31:50.070 INFO:tasks.workunit.client.0.vm04.stdout:testing rename...
2026-03-23T17:31:50.070 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T17:31:50.070 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.125 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.183 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.239 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.296 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.352 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.406 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.460 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.518 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.629 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.683 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.742 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.796 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.850 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.903 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:50.957 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:51.011 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:51.063 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 foo
2026-03-23T17:31:51.080 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T17:31:51.087 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:51.082+0000 7fb5e0436200 -1 librbd: Forced V1 image creation.
2026-03-23T17:31:51.092 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 bar
2026-03-23T17:31:51.123 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename foo foo2
2026-03-23T17:31:51.155 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename foo2 bar
2026-03-23T17:31:51.155 INFO:tasks.workunit.client.0.vm04.stderr:+ grep exists
2026-03-23T17:31:51.183 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23T17:31:51.174+0000 7f9624557200 -1 librbd::Operations: rbd image bar already exists
2026-03-23T17:31:51.183 INFO:tasks.workunit.client.0.vm04.stdout:rbd: rename error: (17) File exists
2026-03-23T17:31:51.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename bar bar2
2026-03-23T17:31:51.219 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename bar2 foo2
2026-03-23T17:31:51.219 INFO:tasks.workunit.client.0.vm04.stderr:+ grep exists
2026-03-23T17:31:51.247 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23T17:31:51.238+0000 7f13d09ca200 -1 librbd::Operations: rbd image foo2 already exists
2026-03-23T17:31:51.247 INFO:tasks.workunit.client.0.vm04.stdout:rbd: rename error: (17) File exists
2026-03-23T17:31:51.247 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T17:31:51.682 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T17:31:51.696 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T17:31:54.650 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -p rbd2 -s 1 foo
2026-03-23T17:31:54.681 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename rbd2/foo rbd2/bar
2026-03-23T17:31:54.716 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls
2026-03-23T17:31:54.717 INFO:tasks.workunit.client.0.vm04.stderr:+ grep bar
2026-03-23T17:31:54.740 INFO:tasks.workunit.client.0.vm04.stdout:bar
2026-03-23T17:31:54.740 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename rbd2/bar foo
2026-03-23T17:31:54.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename --pool rbd2 foo bar
2026-03-23T17:31:54.813 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename rbd2/bar --dest-pool rbd foo
2026-03-23T17:31:54.828 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mv/rename across pools not supported
2026-03-23T17:31:54.828 INFO:tasks.workunit.client.0.vm04.stderr:source pool: rbd2 dest pool: rbd
2026-03-23T17:31:54.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename --pool rbd2 bar --dest-pool rbd2 foo
2026-03-23T17:31:55.071 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls
2026-03-23T17:31:55.071 INFO:tasks.workunit.client.0.vm04.stderr:+ grep foo
2026-03-23T17:31:55.094 INFO:tasks.workunit.client.0.vm04.stdout:foo
2026-03-23T17:31:55.094 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T17:31:55.698 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T17:31:55.710 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T17:31:55.710 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:55.767 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.026 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.080 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.134 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.187 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.239 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.292 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.345 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.397 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.510 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.561 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.854 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.909 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:56.966 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.019 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.073 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.126 INFO:tasks.workunit.client.0.vm04.stdout:testing ls...
2026-03-23T17:31:57.126 INFO:tasks.workunit.client.0.vm04.stderr:+ test_ls
2026-03-23T17:31:57.126 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing ls...'
2026-03-23T17:31:57.126 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T17:31:57.126 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.178 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.233 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.288 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.343 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.396 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.446 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.499 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.552 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.608 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.666 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.727 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.782 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.839 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.892 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.945 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:57.999 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:58.053 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:58.109 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-23T17:31:58.124 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T17:31:58.130 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:58.126+0000 7f4dfc766200 -1 librbd: Forced V1 image creation.
2026-03-23T17:31:58.136 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test2
2026-03-23T17:31:58.151 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T17:31:58.158 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:58.154+0000 7f7fdc79a200 -1 librbd: Forced V1 image creation.
2026-03-23T17:31:58.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.163 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T17:31:58.185 INFO:tasks.workunit.client.0.vm04.stdout:test1
2026-03-23T17:31:58.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.185 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T17:31:58.205 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-23T17:31:58.205 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.205 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:31:58.205 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T17:31:58.228 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T17:31:58.228 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:31:58.228 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*1 MiB.*1'
2026-03-23T17:31:58.253 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 1
2026-03-23T17:31:58.253 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:31:58.253 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*1 MiB.*1'
2026-03-23T17:31:58.277 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 1
2026-03-23T17:31:58.277 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T17:31:58.303 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:58.306 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T17:31:58.332 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:58.335 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-23T17:31:58.365 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T17:31:58.394 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.394 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T17:31:58.415 INFO:tasks.workunit.client.0.vm04.stdout:test1
2026-03-23T17:31:58.415 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.415 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T17:31:58.436 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-23T17:31:58.436 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.436 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:31:58.436 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T17:31:58.457 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T17:31:58.457 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:31:58.457 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*1 MiB.*2'
2026-03-23T17:31:58.487 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 2
2026-03-23T17:31:58.488 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:31:58.488 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*1 MiB.*2'
2026-03-23T17:31:58.517 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-23T17:31:58.517 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T17:31:58.577 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:58.574+0000 7f75358a3640 0 -- 192.168.123.104:0/3037428515 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558724642f20 msgr2=0x5587246746d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:58.579 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:58.583 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T17:31:58.635 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:58.630+0000 7f9441d38640 0 -- 192.168.123.104:0/50240286 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55ab4200af20 msgr2=0x55ab4203c6f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:31:58.641 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:31:58.644 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-23T17:31:58.679 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test2
2026-03-23T17:31:58.693 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T17:31:58.700 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:31:58.694+0000 7fed4b36c200 -1 librbd: Forced V1 image creation.
2026-03-23T17:31:58.705 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.706 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T17:31:58.727 INFO:tasks.workunit.client.0.vm04.stdout:test1
2026-03-23T17:31:58.728 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.728 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T17:31:58.751 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-23T17:31:58.751 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:31:58.751 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:31:58.751 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T17:31:58.776 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T17:31:58.776 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:31:58.776 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*1 MiB.*2'
2026-03-23T17:31:58.807 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 2
2026-03-23T17:31:58.807 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:31:58.807 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*1 MiB.*1'
2026-03-23T17:31:58.836 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 1
2026-03-23T17:31:58.836 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T17:31:58.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:58.907 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.172 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.229 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.292 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.355 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.613 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.674 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.737 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:31:59.797 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.060 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.125 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.193 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.253 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.346 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.403 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.456 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.513 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T17:32:00.566 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99
2026-03-23T17:32:00.566 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.567 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.00 -s 1
2026-03-23T17:32:00.596 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.596 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.01 -s 1
2026-03-23T17:32:00.632 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.632 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.02 -s 1
2026-03-23T17:32:00.663 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.663 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.03 -s 1
2026-03-23T17:32:00.693 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.693 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.04 -s 1
2026-03-23T17:32:00.723 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.723 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.05 -s 1
2026-03-23T17:32:00.754 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.754 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.06 -s 1
2026-03-23T17:32:00.784 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.784 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.07 -s 1
2026-03-23T17:32:00.816 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.816 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.08 -s 1
2026-03-23T17:32:00.844 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:00.844 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.09 -s 1
2026-03-23T17:32:01.076 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.077 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.10 -s 1
2026-03-23T17:32:01.108 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.108 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.11 -s 1
2026-03-23T17:32:01.138 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.138 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.12 -s 1
2026-03-23T17:32:01.170 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.13 -s 1
2026-03-23T17:32:01.202 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.202 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.14 -s 1
2026-03-23T17:32:01.234 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.234 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.15 -s 1
2026-03-23T17:32:01.266 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.266 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.16 -s 1
2026-03-23T17:32:01.299 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.299 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.17 -s 1
2026-03-23T17:32:01.330 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.330 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.18 -s 1
2026-03-23T17:32:01.363 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.19 -s 1
2026-03-23T17:32:01.394 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.394 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.20 -s 1
2026-03-23T17:32:01.425 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.426 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.21 -s 1
2026-03-23T17:32:01.458 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.458 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.22 -s 1
2026-03-23T17:32:01.489 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.489 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.23 -s 1
2026-03-23T17:32:01.523 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.523 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.24 -s 1
2026-03-23T17:32:01.555 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.555 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.25 -s 1
2026-03-23T17:32:01.588 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.588 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.26 -s 1
2026-03-23T17:32:01.624 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.624 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.27 -s 1
2026-03-23T17:32:01.661 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.661 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.28 -s 1
2026-03-23T17:32:01.694 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.694 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.29 -s 1
2026-03-23T17:32:01.729 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.30 -s 1
2026-03-23T17:32:01.762 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.762 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.31 -s 1
2026-03-23T17:32:01.798 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.798 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.32 -s 1
2026-03-23T17:32:01.833 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.833 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.33 -s 1
2026-03-23T17:32:01.867 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.34 -s 1
2026-03-23T17:32:01.902 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.902 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.35 -s 1
2026-03-23T17:32:01.935 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.935 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.36 -s 1
2026-03-23T17:32:01.967 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:01.967 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.37 -s 1
2026-03-23T17:32:02.001 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.001 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.38 -s 1
2026-03-23T17:32:02.037 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.037 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.39 -s 1
2026-03-23T17:32:02.071 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.071 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.40 -s 1
2026-03-23T17:32:02.107 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.107 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.41 -s 1
2026-03-23T17:32:02.141 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.141 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.42 -s 1
2026-03-23T17:32:02.177 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.177 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.43 -s 1
2026-03-23T17:32:02.209 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.209 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.44 -s 1
2026-03-23T17:32:02.241 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.241 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.45 -s 1
2026-03-23T17:32:02.273 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.273 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.46 -s 1
2026-03-23T17:32:02.310 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.310 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.47 -s 1
2026-03-23T17:32:02.344 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.344 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.48 -s 1
2026-03-23T17:32:02.377 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.377 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.49 -s 1
2026-03-23T17:32:02.410 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.411 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.50 -s 1
2026-03-23T17:32:02.444 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.444 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.51 -s 1
2026-03-23T17:32:02.479 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.480 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.52 -s 1
2026-03-23T17:32:02.521 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.521 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.53 -s 1
2026-03-23T17:32:02.556 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.556 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.54 -s 1
2026-03-23T17:32:02.592 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.592 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.55 -s 1
2026-03-23T17:32:02.625 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.625 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.56 -s 1
2026-03-23T17:32:02.658 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.658 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.57 -s 1
2026-03-23T17:32:02.691 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.692 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.58 -s 1
2026-03-23T17:32:02.730 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.730 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.59 -s 1
2026-03-23T17:32:02.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.764 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.60 -s 1
2026-03-23T17:32:02.797 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.797 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.61 -s 1
2026-03-23T17:32:02.831 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.831 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.62 -s 1
2026-03-23T17:32:02.866 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.63 -s 1
2026-03-23T17:32:02.903 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.904 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.64 -s 1
2026-03-23T17:32:02.939 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.939 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.65 -s 1
2026-03-23T17:32:02.973 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:02.973 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.66 -s 1
2026-03-23T17:32:03.006 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.006 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.67 -s 1
2026-03-23T17:32:03.041 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.041 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.68 -s 1
2026-03-23T17:32:03.077 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.077 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.69 -s 1
2026-03-23T17:32:03.114 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.114 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.70 -s 1
2026-03-23T17:32:03.147 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.147 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.71 -s 1
2026-03-23T17:32:03.179 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.179 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.72 -s 1
2026-03-23T17:32:03.212 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.212 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.73 -s 1
2026-03-23T17:32:03.250 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.250 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.74 -s 1
2026-03-23T17:32:03.287 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.287 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.75 -s 1
2026-03-23T17:32:03.320 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.320 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.76 -s 1
2026-03-23T17:32:03.352 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.353 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.77 -s 1
2026-03-23T17:32:03.387 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.387 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.78 -s 1
2026-03-23T17:32:03.419 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.419 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.79 -s 1
2026-03-23T17:32:03.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.452 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.80 -s 1
2026-03-23T17:32:03.482 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.482 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.81 -s 1
2026-03-23T17:32:03.514 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.514 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.82 -s 1
2026-03-23T17:32:03.548 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.548 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.83 -s 1
2026-03-23T17:32:03.581 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.581 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.84 -s 1
2026-03-23T17:32:03.614 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.614 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.85 -s 1
2026-03-23T17:32:03.648 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.648 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.86 -s 1
2026-03-23T17:32:03.684 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.684 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.87 -s 1
2026-03-23T17:32:03.738 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.738 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.88 -s 1
2026-03-23T17:32:03.771 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.771 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.89 -s 1
2026-03-23T17:32:03.806 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.806 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.90 -s 1
2026-03-23T17:32:03.839 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.839 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.91 -s 1
2026-03-23T17:32:03.877 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.877 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.92 -s 1
2026-03-23T17:32:03.912 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.912 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.93 -s 1
2026-03-23T17:32:03.943 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.943 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.94 -s 1
2026-03-23T17:32:03.989 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:03.989 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.95 -s 1
2026-03-23T17:32:04.029 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.029 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.96 -s 1
2026-03-23T17:32:04.064 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.064 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.97 -s 1
2026-03-23T17:32:04.099 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.98 -s 1
2026-03-23T17:32:04.132 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.132 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.99 -s 1
2026-03-23T17:32:04.165 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T17:32:04.165 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:32:04.165 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100
2026-03-23T17:32:04.190 INFO:tasks.workunit.client.0.vm04.stdout:100
2026-03-23T17:32:04.191 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T17:32:04.191 INFO:tasks.workunit.client.0.vm04.stderr:+ grep image
2026-03-23T17:32:04.191 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T17:32:04.191 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100
2026-03-23T17:32:04.236 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:04.234+0000 7f4473ac1640 0 -- 192.168.123.104:0/2406716685 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55abfaacb740 msgr2=0x55abfab0f550 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:04.240 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:04.238+0000 7f4473ac1640 0 -- 192.168.123.104:0/2406716685 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f445405c7d0 msgr2=0x7f445407cbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:04.429 INFO:tasks.workunit.client.0.vm04.stdout:100
2026-03-23T17:32:04.429 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99
2026-03-23T17:32:04.430 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.00
2026-03-23T17:32:04.496 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:04.494+0000 7f72b1c6c640 0 -- 192.168.123.104:0/1699485371 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f72980023a0 msgr2=0x558084e98540 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:04.499 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:04.505 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.505 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.01
2026-03-23T17:32:04.581 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:04.585 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.585 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.02
2026-03-23T17:32:04.653 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:04.650+0000 7f522713d640 0 -- 192.168.123.104:0/2221932630 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f520405c690 msgr2=0x7f520407ca90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:04.654 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:04.658 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.659 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.03
2026-03-23T17:32:04.809 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:04.813 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.813 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.04
2026-03-23T17:32:04.874 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:04.870+0000 7f3a1efa3640 0 -- 192.168.123.104:0/1716051660 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5562143bcf20 msgr2=0x5562143ee640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:04.877 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:04.881 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.881 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.05
2026-03-23T17:32:04.949 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:04.946+0000 7f3aa8776640 0 -- 192.168.123.104:0/2805405297 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5621aea9cb50 msgr2=0x5621aeace5b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:04.949 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:04.952 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:04.953 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.06
2026-03-23T17:32:05.016 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.010+0000 7f58f3fff640 0 -- 192.168.123.104:0/252100567 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f58d0008d30 msgr2=0x7f58d00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.019 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.024 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.024 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.07
2026-03-23T17:32:05.092 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.086+0000 7fd08bb08640 0 -- 192.168.123.104:0/1638372566 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fd0780237d0 msgr2=0x55747760a090 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.096 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.099 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.08
2026-03-23T17:32:05.161 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.158+0000 7f09e75dc640 0 -- 192.168.123.104:0/675803172 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55be0a659090 msgr2=0x55be0a73cca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.163 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.167 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.167 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.09
2026-03-23T17:32:05.229 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.226+0000 7fa3617ba640 0 -- 192.168.123.104:0/1502323518 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fa344012da0 msgr2=0x7fa344013210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.231 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.235 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.235 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.10
2026-03-23T17:32:05.312 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.310+0000 7faa3b892640 0 -- 192.168.123.104:0/1732025948 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5593ef728b50 msgr2=0x5593ef75a670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.312 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.316 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.316 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.11
2026-03-23T17:32:05.382 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.378+0000 7f7fdf212640 0 -- 192.168.123.104:0/517422963 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f7fbc05c700 msgr2=0x7f7fbc07cb00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.385 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.389 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.389 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.12
2026-03-23T17:32:05.463 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.458+0000 7f861cc96640 0 -- 192.168.123.104:0/2968106094 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f85f4008d30 msgr2=0x7f85f40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:05.467 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.472 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.13
2026-03-23T17:32:05.540 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.538+0000 7f9758997640 0 -- 192.168.123.104:0/617915425 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e0fd0adb50 msgr2=0x55e0fd0df670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.540 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.545 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.545 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.14
2026-03-23T17:32:05.615 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.610+0000 7f83a3742640 0 -- 192.168.123.104:0/220510586 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f837c008d70 msgr2=0x7f837c0291f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.618 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.622 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.622 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.15
2026-03-23T17:32:05.689 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.686+0000 7f9e4f6ce640 0 -- 192.168.123.104:0/177918337 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e13c0f6880 msgr2=0x55e13c0e6780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.691 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.694 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.694 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.16
2026-03-23T17:32:05.783 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.778+0000 7fc3fe93f640 0 -- 192.168.123.104:0/3057679047 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x559ec6ed0b50 msgr2=0x559ec6f02650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:05.786 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.790 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.790 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.17
2026-03-23T17:32:05.859 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.854+0000 7fa083a3a640 0 -- 192.168.123.104:0/1003252201 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x559e561d3f20 msgr2=0x559e562056d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.862 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.867 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.18
2026-03-23T17:32:05.926 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.922+0000 7f179c9fd640 0 -- 192.168.123.104:0/3340221995 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f1774008d30 msgr2=0x7f17740291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:05.929 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:05.933 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:05.933 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.19
2026-03-23T17:32:05.998 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:05.994+0000 7f28928cf640 0 -- 192.168.123.104:0/2547838704 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f286c008d30 msgr2=0x7f286c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:06.002 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.006 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.006 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.20
2026-03-23T17:32:06.071 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.066+0000 7f26a9a84640 0 -- 192.168.123.104:0/3323946495 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5641047b9b50 msgr2=0x5641047eb6b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:06.071 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.074 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.075 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.21
2026-03-23T17:32:06.144 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.138+0000 7ff88e9b1640 0 -- 192.168.123.104:0/4131053658 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556e76871420 msgr2=0x556e768a3590 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:06.147 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.151 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.151 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.22
2026-03-23T17:32:06.224 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.229 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.229 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.23
2026-03-23T17:32:06.334 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.330+0000 7f872b271640 0 -- 192.168.123.104:0/3946065089 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d027dc6b50 msgr2=0x55d027df8700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:06.337 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.341 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.24
2026-03-23T17:32:06.604 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.602+0000 7fb72d300640 0 -- 192.168.123.104:0/3765287 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x56016be3fb50 msgr2=0x56016be71650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:06.616 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.620 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.621 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.25
2026-03-23T17:32:06.688 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.682+0000 7fe53a42c640 0 -- 192.168.123.104:0/2061540185 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55c282e43090 msgr2=0x55c282f27150 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:06.689 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.693 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.693 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.26
2026-03-23T17:32:06.757 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.754+0000 7ff97f070640 0 -- 192.168.123.104:0/871442990 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d190eac090 msgr2=0x55d190f8fce0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:06.760 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.765 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.765 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.27
2026-03-23T17:32:06.856 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.859 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.859 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.28
2026-03-23T17:32:06.926 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:06.930 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:06.930 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.29
2026-03-23T17:32:07.004 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:06.998+0000 7faf9be3c640 0 -- 192.168.123.104:0/935204653 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d94c939f20 msgr2=0x55d94c96fc20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.007 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.011 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.011 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.30
2026-03-23T17:32:07.080 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.074+0000 7fd7d8af3640 0 -- 192.168.123.104:0/1753139143 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55917fddcf20 msgr2=0x55917fe0e6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.082 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.086 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.086 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.31
2026-03-23T17:32:07.156 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.154+0000 7f4ebc2a4640 0 -- 192.168.123.104:0/2017001287 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558da640a090 msgr2=0x558da64edca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.157 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.160 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.160 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.32
2026-03-23T17:32:07.225 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.222+0000 7fd89aa8b640 0 -- 192.168.123.104:0/1814538759 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fd87805c7d0 msgr2=0x7fd87807cbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.229 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.233 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.233 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.33
2026-03-23T17:32:07.300 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.298+0000 7f7500790640 0 -- 192.168.123.104:0/3176829849 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f74e005c630 msgr2=0x7f74e007ca30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:07.309 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.319 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.319 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.34
2026-03-23T17:32:07.462 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.458+0000 7f0f41fd7640 0 -- 192.168.123.104:0/2790388165 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x555e40602f20 msgr2=0x555e406346d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.463 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.467 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.467 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.35
2026-03-23T17:32:07.538 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.534+0000 7f0820177640 0 -- 192.168.123.104:0/231284604 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5567629d6090 msgr2=0x556762ab9ca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.541 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.545 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.545 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.36
2026-03-23T17:32:07.623 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:07.618+0000 7fe8e8f79640 0 -- 192.168.123.104:0/2425154539 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d26b2e7830 msgr2=0x55d26b2c37b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:07.623 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:07.628 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:07.628 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.37
2026-03-23T17:32:08.308 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:08.306+0000 7f836e762640 0 -- 192.168.123.104:0/3603636372 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557206238b50 msgr2=0x55720626a4c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:08.312 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.316 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.316 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.38
2026-03-23T17:32:08.545 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:08.542+0000 7fc0bb2f1640 0 -- 192.168.123.104:0/2695311801 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556fc97c6b50 msgr2=0x556fc97f8700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:08.549 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.555 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.555 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.39
2026-03-23T17:32:08.630 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:08.626+0000 7f35401ad640 0 -- 192.168.123.104:0/787612700 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55856d64e090 msgr2=0x55856d731ce0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:08.630 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.635 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.635 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.40
2026-03-23T17:32:08.708 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:08.706+0000 7f8d859a4640 0 -- 192.168.123.104:0/3157239972 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55b9a4dc1b50 msgr2=0x55b9a4d9bbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:08.709 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.713 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.713 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.41
2026-03-23T17:32:08.805 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.810 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.810 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.42
2026-03-23T17:32:08.883 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:08.878+0000 7f204cc98640 0 -- 192.168.123.104:0/211664432 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5618ab284b50 msgr2=0x5618ab2b6650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:08.885 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.889 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.889 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.43
2026-03-23T17:32:08.973 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:08.970+0000 7f461e89b640 0 -- 192.168.123.104:0/2429255825 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5641e9a7ab50 msgr2=0x5641e9aac650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:08.975 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:08.979 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:08.979 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.44
2026-03-23T17:32:09.074 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.070+0000 7fe4477a3640 0 -- 192.168.123.104:0/46692327 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556ece767f20 msgr2=0x556ece799640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:09.074 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:09.079 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:09.079 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.45
2026-03-23T17:32:09.153 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.150+0000 7f3e08e3c640 0 -- 192.168.123.104:0/1305444031 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x561712fa9880 msgr2=0x561712f99780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:09.155 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:09.159 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:09.159 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.46
2026-03-23T17:32:09.237 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:09.241 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:09.242 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.47
2026-03-23T17:32:09.362 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.358+0000 7f5d52de2640 0 -- 192.168.123.104:0/2769100956 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563729153f20 msgr2=0x5637291856d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:09.364 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:09.369 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:09.369 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.48
2026-03-23T17:32:09.437 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.434+0000 7f93bae0b640 0 -- 192.168.123.104:0/2160308276 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5573780a3f20 msgr2=0x5573780d56d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:09.440 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:09.444 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:09.444 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.49
2026-03-23T17:32:09.531 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.526+0000 7f27c1153640 0 -- 192.168.123.104:0/1740853694 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f2798000ec0 msgr2=0x7f2798040440 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:09.535 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:09.540 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:09.540 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.50 2026-03-23T17:32:09.627 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.622+0000 7f7ef92f4640 0 -- 192.168.123.104:0/4049081417 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563d62479090 msgr2=0x563d6255cca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:09.630 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:09.634 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:09.634 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.51 2026-03-23T17:32:09.709 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.706+0000 7f2012db6640 0 -- 192.168.123.104:0/2920053888 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x564a11031f20 msgr2=0x564a110636d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:09.712 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:09.716 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:09.716 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.52 2026-03-23T17:32:09.788 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.782+0000 7f9890bd8640 0 -- 192.168.123.104:0/1736175237 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55c559bedb50 msgr2=0x55c559c1f4c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:09.788 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:09.793 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:09.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.53 2026-03-23T17:32:09.905 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.902+0000 7f6b109dd640 0 -- 192.168.123.104:0/233132904 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f6ae80130d0 msgr2=0x7f6ae8013540 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:09.908 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:09.911 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:09.912 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.54 2026-03-23T17:32:09.990 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:09.986+0000 7f82283f2640 0 -- 192.168.123.104:0/2364065620 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5620be6d3090 msgr2=0x5620be7b6ca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:09.994 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:09.998 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:09.998 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.55 2026-03-23T17:32:10.079 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.074+0000 7f7aaa0e3640 0 -- 192.168.123.104:0/153610546 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557889557bf0 msgr2=0x557889492cc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.083 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:10.087 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.56 2026-03-23T17:32:10.169 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.166+0000 7f434b7fe640 0 -- 192.168.123.104:0/2090497536 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f4328008d30 msgr2=0x7f43280291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.175 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.180 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.180 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.57 2026-03-23T17:32:10.277 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.274+0000 7fdfb232d640 0 -- 192.168.123.104:0/1540356585 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fdf8c012df0 msgr2=0x7fdf8c013260 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:10.282 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.287 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.287 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.58 2026-03-23T17:32:10.368 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.362+0000 7f3f50bb5640 0 -- 192.168.123.104:0/364844712 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5655522bff20 msgr2=0x5655522f16d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.368 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:10.372 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.59 2026-03-23T17:32:10.487 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.491 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.491 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.60 2026-03-23T17:32:10.578 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.574+0000 7efe36322640 0 -- 192.168.123.104:0/2328629557 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563722ea1b50 msgr2=0x563722ed36b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.578 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.582 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.582 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.61 2026-03-23T17:32:10.651 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.646+0000 7f17b2e26640 0 -- 192.168.123.104:0/1639833015 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f178c008d30 msgr2=0x7f178c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.654 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:10.658 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.658 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.62 2026-03-23T17:32:10.754 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.750+0000 7f0fef903640 0 -- 192.168.123.104:0/3976464272 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5576120b2420 msgr2=0x5576120e4530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.754 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.758 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.758 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.63 2026-03-23T17:32:10.835 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.830+0000 7fbc79fdb640 0 -- 192.168.123.104:0/363663733 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560f080fbf20 msgr2=0x560f0812d6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.836 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.841 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.841 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.64 2026-03-23T17:32:10.911 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.906+0000 7f545ef6d640 0 -- 192.168.123.104:0/839620298 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556e76afbb50 msgr2=0x556e76b2d5b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.915 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:10.920 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.920 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.65 2026-03-23T17:32:10.984 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:10.982+0000 7fbd229f6640 0 -- 192.168.123.104:0/2721509421 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x558005e3b5e0 msgr2=0x558005e5ba60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:10.988 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:10.993 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:10.993 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.66 2026-03-23T17:32:11.064 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.058+0000 7faa6f751640 0 -- 192.168.123.104:0/2175529903 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7faa48008d30 msgr2=0x7faa480291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:11.068 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:11.073 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.073 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.67 2026-03-23T17:32:11.144 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.142+0000 7f557abf9640 0 -- 192.168.123.104:0/3107812575 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557fb2e13f20 msgr2=0x557fb2e45640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:11.148 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:11.153 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.153 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.68 2026-03-23T17:32:11.234 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.230+0000 7f550b6e1640 0 -- 192.168.123.104:0/1523329122 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f54e4012da0 msgr2=0x7f54e4013210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:11.238 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:11.242 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.242 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.69 2026-03-23T17:32:11.315 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.310+0000 7f1169d56640 0 -- 192.168.123.104:0/1542832620 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f114c008d30 msgr2=0x7f114c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:11.318 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:11.323 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.323 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.70 2026-03-23T17:32:11.395 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.390+0000 7fb0a49b6640 0 -- 192.168.123.104:0/2394779541 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55969f23ab50 msgr2=0x55969f26c650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:11.402 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:11.407 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.71 2026-03-23T17:32:11.498 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.494+0000 7f87353db640 0 -- 192.168.123.104:0/4014185674 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5638deb6b090 msgr2=0x5638dec4ece0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:11.503 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:11.508 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.508 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.72 2026-03-23T17:32:11.582 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.578+0000 7f2fdd70c640 0 -- 192.168.123.104:0/4126964750 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a339832ad0 msgr2=0x55a3398444c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:11.586 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:11.590 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.73 2026-03-23T17:32:11.838 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:11.842 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.843 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.74 2026-03-23T17:32:11.917 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:11.921 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:11.921 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.75 2026-03-23T17:32:12.003 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:11.998+0000 7f0e060d5640 0 -- 192.168.123.104:0/2172068222 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558c09c5d820 msgr2=0x558c09c91810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:12.003 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:12.007 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.007 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.76 2026-03-23T17:32:12.078 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:12.082 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.082 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.77 2026-03-23T17:32:12.174 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:12.170+0000 7f63dbd37640 0 -- 192.168.123.104:0/3820721344 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d3aeebb880 msgr2=0x55d3aeeab780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:12.178 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:12.183 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.78 2026-03-23T17:32:12.251 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:12.246+0000 7f0931def640 0 -- 192.168.123.104:0/2881697901 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f091400d230 msgr2=0x7f0914012ee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:12.458 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:12.463 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.463 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.79 2026-03-23T17:32:12.569 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:12.566+0000 7f83f334a640 0 -- 192.168.123.104:0/1807552121 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556a23a33b50 msgr2=0x556a23a69c30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:12.570 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:12.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.574 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.80 2026-03-23T17:32:12.853 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:12.850+0000 7f4d76393640 0 -- 192.168.123.104:0/3560134550 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5560bbbebf20 msgr2=0x5560bbc1d640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:12.855 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:12.860 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.860 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.81 2026-03-23T17:32:12.936 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:12.930+0000 7fb5990d7640 0 -- 192.168.123.104:0/2324071516 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fb57c012d90 msgr2=0x7fb57c013200 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:12.939 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:12.943 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:12.943 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.82 2026-03-23T17:32:13.027 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.022+0000 7f359f0f1640 0 -- 192.168.123.104:0/3068150536 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55b302a5ff20 msgr2=0x55b302a916d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.027 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.032 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.032 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.83 2026-03-23T17:32:13.114 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.110+0000 7f1b553e8640 0 -- 192.168.123.104:0/2895697315 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5560299e58a0 msgr2=0x556029a16ef0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.119 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:13.123 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.123 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.84 2026-03-23T17:32:13.207 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.202+0000 7ff95ee73640 0 -- 192.168.123.104:0/1058207039 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55f9aee93b50 msgr2=0x55f9aeec5650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:13.210 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.214 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.214 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.85 2026-03-23T17:32:13.291 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.286+0000 7fd93bfff640 0 -- 192.168.123.104:0/2835431312 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55e1acf63690 msgr2=0x55e1acf83b10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.295 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.298 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.299 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.86 2026-03-23T17:32:13.366 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.362+0000 7f95eed26640 0 -- 192.168.123.104:0/880012488 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x555c5e304880 msgr2=0x555c5e2f4780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.366 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:13.370 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.370 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.87 2026-03-23T17:32:13.436 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.430+0000 7f59fe40c640 0 -- 192.168.123.104:0/3800817136 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5626996f6f20 msgr2=0x5626997286d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.439 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.442 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.442 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.88 2026-03-23T17:32:13.509 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.506+0000 7f1c6d8fc640 0 -- 192.168.123.104:0/1527705304 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x564112bc6f20 msgr2=0x564112bf86d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.513 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.517 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.517 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.89 2026-03-23T17:32:13.590 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.586+0000 7f0b32f49640 0 -- 192.168.123.104:0/3887757378 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560483bb4b50 msgr2=0x560483be6650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.593 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:13.597 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.597 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.90 2026-03-23T17:32:13.665 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.662+0000 7f9239a69640 0 -- 192.168.123.104:0/3497113111 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f9210008d30 msgr2=0x7f921809e180 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.674 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.679 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.679 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.91 2026-03-23T17:32:13.758 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:13.754+0000 7f01614d9640 0 -- 192.168.123.104:0/3556794282 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55c0054dcf20 msgr2=0x55c00550e6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:13.758 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:13.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:13.764 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.92 2026-03-23T17:32:14.093 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:14.090+0000 7f2bca965640 0 -- 192.168.123.104:0/2472792195 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55b48e633b50 msgr2=0x55b48e665650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:14.128 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:14.133 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.93 2026-03-23T17:32:14.245 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:14.250 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.251 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.94 2026-03-23T17:32:14.346 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:14.351 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.351 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.95 2026-03-23T17:32:14.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:14.434+0000 7f8423615640 0 -- 192.168.123.104:0/1067207799 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55660e27eb50 msgr2=0x55660e2b0650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:14.438 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:14.442 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.442 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.96 2026-03-23T17:32:14.528 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:14.522+0000 7f773c619640 0 -- 192.168.123.104:0/2462034363 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x56407c117090 msgr2=0x56407c1faca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:14.532 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:14.538 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.539 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.97 2026-03-23T17:32:14.799 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:14.794+0000 7fe4dd105640 0 -- 192.168.123.104:0/3831476430 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55bb12c69710 msgr2=0x55bb12c89b90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:14.804 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:14.809 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.809 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.98 2026-03-23T17:32:14.883 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:14.878+0000 7f7350c02640 0 -- 192.168.123.104:0/2080249684 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55894a84d880 msgr2=0x55894a83d780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:14.885 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:14.890 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.99 2026-03-23T17:32:14.960 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:14.958+0000 7fe3839b7640 0 -- 192.168.123.104:0/640655016 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558dffc13b50 msgr2=0x558dffc456b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:14.964 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:14.968 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99 2026-03-23T17:32:14.969 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:14.969 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.00 --image-format 2 -s 1 2026-03-23T17:32:15.005 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.005 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.01 --image-format 2 -s 1 2026-03-23T17:32:15.042 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.043 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.02 --image-format 2 -s 1 2026-03-23T17:32:15.088 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.088 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.03 --image-format 2 -s 1 2026-03-23T17:32:15.129 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.130 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.04 --image-format 2 -s 1 2026-03-23T17:32:15.175 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.176 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.05 --image-format 2 -s 1 2026-03-23T17:32:15.210 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.210 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.06 --image-format 2 -s 1 2026-03-23T17:32:15.244 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.244 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.07 --image-format 2 -s 1 2026-03-23T17:32:15.282 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.282 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.08 --image-format 2 -s 1 2026-03-23T17:32:15.324 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 
2026-03-23T17:32:15.324 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.09 --image-format 2 -s 1 2026-03-23T17:32:15.572 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.572 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.10 --image-format 2 -s 1 2026-03-23T17:32:15.610 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.610 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.11 --image-format 2 -s 1 2026-03-23T17:32:15.649 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.649 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.12 --image-format 2 -s 1 2026-03-23T17:32:15.689 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.689 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.13 --image-format 2 -s 1 2026-03-23T17:32:15.734 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.734 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.14 --image-format 2 -s 1 2026-03-23T17:32:15.775 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.15 --image-format 2 -s 1 2026-03-23T17:32:15.814 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.814 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.16 --image-format 2 -s 1 2026-03-23T17:32:15.863 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.863 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.17 --image-format 2 -s 1 2026-03-23T17:32:15.901 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.901 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.18 --image-format 2 -s 1 2026-03-23T17:32:15.940 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:15.940 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.19 --image-format 2 -s 1 2026-03-23T17:32:15.980 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:15.980 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.20 --image-format 2 -s 1 2026-03-23T17:32:16.032 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.033 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.21 --image-format 2 -s 1 2026-03-23T17:32:16.192 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.192 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.22 --image-format 2 -s 1 2026-03-23T17:32:16.297 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.297 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.23 --image-format 2 -s 1 2026-03-23T17:32:16.345 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.345 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.24 --image-format 2 -s 1 2026-03-23T17:32:16.392 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.392 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.25 --image-format 2 -s 1 2026-03-23T17:32:16.427 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.427 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.26 --image-format 2 -s 1 2026-03-23T17:32:16.464 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.464 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.27 --image-format 2 -s 1 2026-03-23T17:32:16.505 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.505 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.28 --image-format 2 -s 1 2026-03-23T17:32:16.549 INFO:tasks.workunit.client.0.vm04.stderr:+ for
i in $(seq -w 00 99) 2026-03-23T17:32:16.549 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.29 --image-format 2 -s 1 2026-03-23T17:32:16.585 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.585 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.30 --image-format 2 -s 1 2026-03-23T17:32:16.620 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.620 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.31 --image-format 2 -s 1 2026-03-23T17:32:16.657 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.657 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.32 --image-format 2 -s 1 2026-03-23T17:32:16.697 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.697 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.33 --image-format 2 -s 1 2026-03-23T17:32:16.749 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.749 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.34 --image-format 2 -s 1 2026-03-23T17:32:16.789 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.789 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.35 --image-format 2 -s 1 2026-03-23T17:32:16.828 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.828 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.36 --image-format 2 -s 1 2026-03-23T17:32:16.872 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.872 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.37 --image-format 2 -s 1 2026-03-23T17:32:16.911 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.911 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.38 --image-format 2 -s 1 2026-03-23T17:32:16.951 
INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.951 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.39 --image-format 2 -s 1 2026-03-23T17:32:16.987 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:16.987 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.40 --image-format 2 -s 1 2026-03-23T17:32:17.023 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.023 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.41 --image-format 2 -s 1 2026-03-23T17:32:17.066 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.067 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.42 --image-format 2 -s 1 2026-03-23T17:32:17.107 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.107 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.43 --image-format 2 -s 1 2026-03-23T17:32:17.143 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.143 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.44 --image-format 2 -s 1 2026-03-23T17:32:17.182 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.182 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.45 --image-format 2 -s 1 2026-03-23T17:32:17.227 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.227 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.46 --image-format 2 -s 1 2026-03-23T17:32:17.268 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.268 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.47 --image-format 2 -s 1 2026-03-23T17:32:17.307 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.307 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.48 --image-format 2 -s 1 
2026-03-23T17:32:17.341 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.49 --image-format 2 -s 1 2026-03-23T17:32:17.376 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.377 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.50 --image-format 2 -s 1 2026-03-23T17:32:17.421 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.421 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.51 --image-format 2 -s 1 2026-03-23T17:32:17.453 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.453 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.52 --image-format 2 -s 1 2026-03-23T17:32:17.488 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.488 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.53 --image-format 2 -s 1 2026-03-23T17:32:17.527 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.527 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.54 --image-format 2 -s 1 2026-03-23T17:32:17.572 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.572 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.55 --image-format 2 -s 1 2026-03-23T17:32:17.615 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.615 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.56 --image-format 2 -s 1 2026-03-23T17:32:17.661 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.661 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.57 --image-format 2 -s 1 2026-03-23T17:32:17.701 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.701 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.58 
--image-format 2 -s 1 2026-03-23T17:32:17.736 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.736 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.59 --image-format 2 -s 1 2026-03-23T17:32:17.770 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:17.770 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.60 --image-format 2 -s 1 2026-03-23T17:32:18.010 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.010 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.61 --image-format 2 -s 1 2026-03-23T17:32:18.042 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.042 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.62 --image-format 2 -s 1 2026-03-23T17:32:18.074 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.074 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.63 --image-format 2 -s 1 2026-03-23T17:32:18.107 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.107 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.64 --image-format 2 -s 1 2026-03-23T17:32:18.141 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.141 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.65 --image-format 2 -s 1 2026-03-23T17:32:18.175 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.175 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.66 --image-format 2 -s 1 2026-03-23T17:32:18.207 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.207 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.67 --image-format 2 -s 1 2026-03-23T17:32:18.240 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.240 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd 
create image.68 --image-format 2 -s 1 2026-03-23T17:32:18.273 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.273 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.69 --image-format 2 -s 1 2026-03-23T17:32:18.306 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.306 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.70 --image-format 2 -s 1 2026-03-23T17:32:18.339 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.339 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.71 --image-format 2 -s 1 2026-03-23T17:32:18.372 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.72 --image-format 2 -s 1 2026-03-23T17:32:18.407 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.73 --image-format 2 -s 1 2026-03-23T17:32:18.439 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.439 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.74 --image-format 2 -s 1 2026-03-23T17:32:18.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.471 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.75 --image-format 2 -s 1 2026-03-23T17:32:18.691 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:18.690+0000 7f232bfff640 0 --2- 192.168.123.104:0/349269237 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x5635d03cd8e0 0x5635d03be050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-23T17:32:18.706 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.706 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.76 
--image-format 2 -s 1 2026-03-23T17:32:18.738 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.738 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.77 --image-format 2 -s 1 2026-03-23T17:32:18.775 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.78 --image-format 2 -s 1 2026-03-23T17:32:18.810 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.810 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.79 --image-format 2 -s 1 2026-03-23T17:32:18.840 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:18.840 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.80 --image-format 2 -s 1 2026-03-23T17:32:19.075 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.076 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.81 --image-format 2 -s 1 2026-03-23T17:32:19.108 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.108 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.82 --image-format 2 -s 1 2026-03-23T17:32:19.140 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.140 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.83 --image-format 2 -s 1 2026-03-23T17:32:19.172 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.172 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.84 --image-format 2 -s 1 2026-03-23T17:32:19.204 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.204 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.85 --image-format 2 -s 1 2026-03-23T17:32:19.237 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.237 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd 
create image.86 --image-format 2 -s 1 2026-03-23T17:32:19.275 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.276 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.87 --image-format 2 -s 1 2026-03-23T17:32:19.511 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.511 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.88 --image-format 2 -s 1 2026-03-23T17:32:19.547 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.547 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.89 --image-format 2 -s 1 2026-03-23T17:32:19.584 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.584 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.90 --image-format 2 -s 1 2026-03-23T17:32:19.624 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.624 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.91 --image-format 2 -s 1 2026-03-23T17:32:19.659 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.659 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.92 --image-format 2 -s 1 2026-03-23T17:32:19.699 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.699 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.93 --image-format 2 -s 1 2026-03-23T17:32:19.742 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.742 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.94 --image-format 2 -s 1 2026-03-23T17:32:19.775 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.95 --image-format 2 -s 1 2026-03-23T17:32:19.810 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.810 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.96 --image-format 2 -s 1 2026-03-23T17:32:19.846 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.846 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.97 --image-format 2 -s 1 2026-03-23T17:32:19.885 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.885 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.98 --image-format 2 -s 1 2026-03-23T17:32:19.926 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:19.926 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.99 --image-format 2 -s 1 2026-03-23T17:32:19.971 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-23T17:32:19.971 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T17:32:19.971 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100 2026-03-23T17:32:20.002 INFO:tasks.workunit.client.0.vm04.stdout:100 2026-03-23T17:32:20.002 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-23T17:32:20.002 INFO:tasks.workunit.client.0.vm04.stderr:+ grep image 2026-03-23T17:32:20.003 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T17:32:20.003 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100 2026-03-23T17:32:20.047 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.042+0000 7f2ec97d7640 0 -- 192.168.123.104:0/152851816 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f2ea0008d30 msgr2=0x7f2ea00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.051 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.046+0000 7f2ecaa60640 0 -- 192.168.123.104:0/152851816 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55fe7c7bba10 msgr2=0x55fe7c7a9b80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.242 
INFO:tasks.workunit.client.0.vm04.stdout:100 2026-03-23T17:32:20.242 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99 2026-03-23T17:32:20.243 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.00 2026-03-23T17:32:20.322 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.318+0000 7fed73fff640 0 -- 192.168.123.104:0/1192202874 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fed5805c7d0 msgr2=0x7fed5807cbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.329 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.333 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.333 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.01 2026-03-23T17:32:20.395 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.390+0000 7fbe4cc29640 0 -- 192.168.123.104:0/852498366 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d9ab603820 msgr2=0x55d9ab637810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.398 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.403 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.403 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.02 2026-03-23T17:32:20.464 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.458+0000 7f4b16307640 0 -- 192.168.123.104:0/327747728 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f4af0008d30 msgr2=0x7f4af00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:20.468 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:20.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.471 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.03 2026-03-23T17:32:20.532 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.530+0000 7f6b880a3640 0 -- 192.168.123.104:0/643738986 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x555760172090 msgr2=0x555760255ca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.533 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.537 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.537 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.04 2026-03-23T17:32:20.598 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.594+0000 7fb3c2c6d640 0 -- 192.168.123.104:0/3072507108 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e6e3baa820 msgr2=0x55e6e3bde300 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.601 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.605 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.605 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.05 2026-03-23T17:32:20.671 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.666+0000 7f92732f5640 0 -- 192.168.123.104:0/248910968 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55bc46419b50 msgr2=0x55bc4644b6b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.671 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:20.675 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.675 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.06 2026-03-23T17:32:20.734 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.730+0000 7f47bffff640 0 -- 192.168.123.104:0/1100365338 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f479c008d30 msgr2=0x7f479c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:20.741 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.745 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.07 2026-03-23T17:32:20.814 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.810+0000 7f490d536640 0 -- 192.168.123.104:0/3646765620 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x565295e2cbf0 msgr2=0x565295e60090 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.816 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.819 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.819 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.08 2026-03-23T17:32:20.883 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:20.886 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.886 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.09 2026-03-23T17:32:20.949 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:20.946+0000 7f5279e52640 0 -- 192.168.123.104:0/418175292 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55ef2fdacb50 msgr2=0x55ef2fdde700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:20.949 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:20.953 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:20.953 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.10 2026-03-23T17:32:21.107 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.102+0000 7f63d9010640 0 -- 192.168.123.104:0/112931628 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560c0950df20 msgr2=0x560c0953f640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.107 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.111 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.111 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.11 2026-03-23T17:32:21.176 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.174+0000 7f031f7fe640 0 -- 192.168.123.104:0/555476192 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f02fc008d70 msgr2=0x7f02fc0291f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.179 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:21.183 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.12 2026-03-23T17:32:21.241 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.238+0000 7fef7474f640 0 -- 192.168.123.104:0/2453171031 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55c23247cf20 msgr2=0x55c2324ae640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.244 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.248 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.248 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.13 2026-03-23T17:32:21.306 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.302+0000 7fa943fff640 0 -- 192.168.123.104:0/1710448268 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x560375aa05a0 msgr2=0x560375ac0a20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T17:32:21.310 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.314 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.314 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.14 2026-03-23T17:32:21.377 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.374+0000 7f3ae6b21640 0 -- 192.168.123.104:0/2253325461 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x564c14979f20 msgr2=0x564c149ab6f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.383 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:21.387 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.387 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.15 2026-03-23T17:32:21.484 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.482+0000 7f0e6f2a8640 0 -- 192.168.123.104:0/331658083 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f0e48008d30 msgr2=0x7f0e480291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.488 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.493 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.494 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.16 2026-03-23T17:32:21.563 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.558+0000 7f3eac65a640 0 -- 192.168.123.104:0/1459524749 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x56278bbefb50 msgr2=0x56278bc215b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.563 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.566 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.567 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.17 2026-03-23T17:32:21.632 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.626+0000 7f46ad446640 0 -- 192.168.123.104:0/1508548976 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f4684006fa0 msgr2=0x7f4684027380 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.634 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:21.638 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.638 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.18 2026-03-23T17:32:21.700 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.703 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.703 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.19 2026-03-23T17:32:21.760 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:21.758+0000 7f20381bc640 0 -- 192.168.123.104:0/231049734 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55854035bf20 msgr2=0x55854038d640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:21.760 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:21.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:21.764 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.20 2026-03-23T17:32:22.026 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.022+0000 7f2e5ebf4640 0 -- 192.168.123.104:0/2841707224 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f2e3c05c7d0 msgr2=0x7f2e3c07cbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:22.033 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:22.036 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:22.037 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.21 2026-03-23T17:32:22.097 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.094+0000 7f209f059640 0 -- 192.168.123.104:0/3735194929 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55b45b49eb50 msgr2=0x55b45b4d05b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:22.097 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:22.101 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:22.101 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.22 2026-03-23T17:32:22.160 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.158+0000 7f7f4bfff640 0 -- 192.168.123.104:0/320222906 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f7f38008d30 msgr2=0x7f7f380291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:22.166 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T17:32:22.170 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-23T17:32:22.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.23 2026-03-23T17:32:22.238 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.234+0000 7f88ef328640 0 -- 192.168.123.104:0/461843695 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e211ca0b50 msgr2=0x55e211cd25b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T17:32:22.238 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T17:32:22.241 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.241 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.24
2026-03-23T17:32:22.304 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.302+0000 7fb64a8bd640 0 -- 192.168.123.104:0/2102024491 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5592b68a6f20 msgr2=0x5592b68d86d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:22.308 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.312 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.312 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.25
2026-03-23T17:32:22.383 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.378+0000 7f50936b0640 0 -- 192.168.123.104:0/955461529 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f506c008d70 msgr2=0x7f506c0291f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.387 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.392 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.392 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.26
2026-03-23T17:32:22.468 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.462+0000 7f3924c93640 0 -- 192.168.123.104:0/2271946696 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f38fc012d70 msgr2=0x7f38fc0131e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.470 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.474 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.474 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.27
2026-03-23T17:32:22.538 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.534+0000 7f3784d7d640 0 -- 192.168.123.104:0/2157201611 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558274c00f20 msgr2=0x558274c32640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.542 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.546 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.546 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.28
2026-03-23T17:32:22.611 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.606+0000 7fd4d2444640 0 -- 192.168.123.104:0/3908685505 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fd4ac008d30 msgr2=0x7fd4ac0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.613 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.616 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.616 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.29
2026-03-23T17:32:22.682 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.685 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.685 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.30
2026-03-23T17:32:22.748 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.746+0000 7fec351c1640 0 -- 192.168.123.104:0/3315607107 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5638484bb210 msgr2=0x5638485b4910 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.752 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.755 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.755 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.31
2026-03-23T17:32:22.826 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.829 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.32
2026-03-23T17:32:22.895 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.890+0000 7f6bce720640 0 -- 192.168.123.104:0/3509375777 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55649f160b50 msgr2=0x55649f192670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.895 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.899 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.899 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.33
2026-03-23T17:32:22.957 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:22.954+0000 7fba990ca640 0 -- 192.168.123.104:0/1041760981 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fba7c008bf0 msgr2=0x7fba7c008fc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:22.961 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:22.964 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:22.964 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.34
2026-03-23T17:32:23.032 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:23.035 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:23.035 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.35
2026-03-23T17:32:23.098 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:23.094+0000 7f8acb1c6640 0 -- 192.168.123.104:0/3937962732 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e66238e090 msgr2=0x55e662471d50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:23.102 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:23.106 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:23.106 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.36
2026-03-23T17:32:23.167 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:23.162+0000 7fdfd06c8640 0 -- 192.168.123.104:0/2239981644 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x564685360880 msgr2=0x564685350780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:32:23.169 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:23.173 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:23.173 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.37
2026-03-23T17:32:23.231 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:23.226+0000 7feddfa16640 0 -- 192.168.123.104:0/2332644613 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x559358d8d450 msgr2=0x559358dad8d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T17:32:23.236 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T17:32:23.239 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T17:32:23.239 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.38
2026-03-23T17:32:23.299 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:32:23.294+0000 7f9f3b78c640 0 -- 192.168.123.104:0/360567706 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f9f1805c5c0 msgr2=0x7f9f1807c9c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T17:35:23.275 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T17:35:23.274+0000 7f9f3a503640 0 -- 192.168.123.104:0/360567706 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x562e15085b50 msgr2=0x562e14fc0a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:23.307 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:23.314 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:23.314 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.39
2026-03-23T18:02:23.393 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:23.392+0000 7fdefcf7a640 0 -- 192.168.123.104:0/3513898574 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fdedc05c700 msgr2=0x7fdedc07cb00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:23.595 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:23.601 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:23.601 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.40
2026-03-23T18:02:23.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:23.868+0000 7fa1ff906640 0 -- 192.168.123.104:0/412572324 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55b6f7a4e620 msgr2=0x55b6f7a6eaa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:23.872 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:23.876 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:23.876 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.41
2026-03-23T18:02:23.949 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:23.948+0000 7f3578662640 0 -- 192.168.123.104:0/1360142144 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55fb2f93ff20 msgr2=0x55fb2f971640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:23.952 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:23.956 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:23.957 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.42
2026-03-23T18:02:24.028 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.028+0000 7f6b1b719640 0 -- 192.168.123.104:0/2112147754 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f6af8002a70 msgr2=0x7f6af8002e80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.032 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.037 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.037 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.43
2026-03-23T18:02:24.105 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.104+0000 7f9191acd640 0 -- 192.168.123.104:0/4238967026 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f9174012da0 msgr2=0x7f9174013210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.109 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.113 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.113 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.44
2026-03-23T18:02:24.192 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.197 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.197 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.45
2026-03-23T18:02:24.271 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.272+0000 7f32cc0bb640 0 -- 192.168.123.104:0/1280549214 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5641f486f880 msgr2=0x5641f48a4550 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.273 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.278 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.278 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.46
2026-03-23T18:02:24.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.344+0000 7f917df9e640 0 -- 192.168.123.104:0/4144365992 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f915c05c7d0 msgr2=0x7f915c07cbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.348 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.352 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.352 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.47
2026-03-23T18:02:24.424 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.424+0000 7fe195182640 0 -- 192.168.123.104:0/259705379 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55ace431ced0 msgr2=0x55ace433d350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.430 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.434 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.434 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.48
2026-03-23T18:02:24.508 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.508+0000 7f58b37fe640 0 -- 192.168.123.104:0/3742573039 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f5890005e60 msgr2=0x7f5890002d80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.513 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.518 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.518 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.49
2026-03-23T18:02:24.594 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.596+0000 7f7ab57eb640 0 -- 192.168.123.104:0/3094182270 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5653961a9b50 msgr2=0x5653961db650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.595 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.598 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.599 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.50
2026-03-23T18:02:24.671 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.672+0000 7fb695249640 0 -- 192.168.123.104:0/2955330903 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fb678008d30 msgr2=0x7fb6780291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.674 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.678 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.678 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.51
2026-03-23T18:02:24.749 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.748+0000 7f40bb5a4640 0 -- 192.168.123.104:0/1889952068 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e49a530210 msgr2=0x55e49a629b50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:24.753 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.756 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.757 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.52
2026-03-23T18:02:24.826 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.828+0000 7ff8d3695640 0 -- 192.168.123.104:0/3871910383 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5556d0b57f20 msgr2=0x5556d0b896d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.831 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.835 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.835 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.53
2026-03-23T18:02:24.905 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.904+0000 7f99bbaf3640 0 -- 192.168.123.104:0/2100148223 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x561b2af1cf20 msgr2=0x561b2af4e6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.909 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.913 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.54
2026-03-23T18:02:24.991 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:24.992+0000 7f56a19ef640 0 -- 192.168.123.104:0/3883060660 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a468d9df20 msgr2=0x55a468dcf640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:24.991 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:24.995 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:24.995 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.55
2026-03-23T18:02:25.070 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.074 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.074 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.56
2026-03-23T18:02:25.154 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.152+0000 7fc0888dd640 0 -- 192.168.123.104:0/280702901 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55efde8f6880 msgr2=0x55efde8bcd20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:25.158 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.162 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.162 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.57
2026-03-23T18:02:25.232 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.232+0000 7f9d170f9640 0 -- 192.168.123.104:0/2441972224 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f9cf0013160 msgr2=0x7f9cf00135d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.236 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.240 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.240 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.58
2026-03-23T18:02:25.348 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.348+0000 7f8cfcdbf640 0 -- 192.168.123.104:0/2793015589 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a461521820 msgr2=0x55a461555300 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:25.352 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.357 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.357 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.59
2026-03-23T18:02:25.427 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.428+0000 7ff86463a640 0 -- 192.168.123.104:0/2762677272 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55ba09901b50 msgr2=0x7ff84409d920 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.433 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.437 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.437 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.60
2026-03-23T18:02:25.511 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.512+0000 7fb1537fe640 0 -- 192.168.123.104:0/474125232 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fb130008d30 msgr2=0x7fb1300291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.515 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.519 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.519 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.61
2026-03-23T18:02:25.587 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.588+0000 7f8cf1c2c640 0 -- 192.168.123.104:0/3095347259 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f8cd805c5b0 msgr2=0x7f8cd807c9b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.596 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.600 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.600 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.62
2026-03-23T18:02:25.680 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.680+0000 7f88fdde1640 0 -- 192.168.123.104:0/3354238207 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560560269b50 msgr2=0x56056029b650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.682 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.686 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.686 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.63
2026-03-23T18:02:25.760 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.764 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.64
2026-03-23T18:02:25.842 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.840+0000 7fe096550640 0 -- 192.168.123.104:0/3114295013 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fe070012df0 msgr2=0x7fe070013260 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.852 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.856 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.856 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.65
2026-03-23T18:02:25.925 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:25.924+0000 7f2badf64640 0 -- 192.168.123.104:0/4029477429 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f2b90012da0 msgr2=0x7f2b90013210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:25.930 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:25.934 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:25.934 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.66
2026-03-23T18:02:26.011 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.012+0000 7fd8d9cd8640 0 -- 192.168.123.104:0/928543553 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55edf0ef6f20 msgr2=0x55edf0f28640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.011 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.015 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.015 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.67
2026-03-23T18:02:26.091 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.092+0000 7f2f0e121640 0 -- 192.168.123.104:0/2934910556 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x556851e0c790 msgr2=0x556851e2cc10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.094 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.099 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.68
2026-03-23T18:02:26.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.176+0000 7f76a606f640 0 -- 192.168.123.104:0/490527839 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f7680012d70 msgr2=0x7f76800131e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.181 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.185 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.69
2026-03-23T18:02:26.258 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.256+0000 7faf175df640 0 -- 192.168.123.104:0/1387555057 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7faef405c7d0 msgr2=0x7faef407cbd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.266 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.270 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.270 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.70
2026-03-23T18:02:26.340 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.340+0000 7f7d37fff640 0 -- 192.168.123.104:0/2717693318 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x560c0615d630 msgr2=0x560c0617dab0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.348 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.352 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.352 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.71
2026-03-23T18:02:26.427 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.432 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.432 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.72
2026-03-23T18:02:26.507 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.508+0000 7f7d43fff640 0 -- 192.168.123.104:0/4049495311 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x560b2f42b680 msgr2=0x560b2f44bb00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.512 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.516 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.516 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.73
2026-03-23T18:02:26.591 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.592+0000 7fa22343e640 0 -- 192.168.123.104:0/2826262788 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fa20005c700 msgr2=0x7fa20007cb00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.595 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.599 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.599 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.74
2026-03-23T18:02:26.717 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.716+0000 7fa515d16640 0 -- 192.168.123.104:0/1957224239 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55f5ed34af20 msgr2=0x55f5ed37c640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.721 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.725 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.725 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.75
2026-03-23T18:02:26.799 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.800+0000 7f4b49935640 0 -- 192.168.123.104:0/2985002202 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557961727b50 msgr2=0x5579617596d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.801 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.805 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.805 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.76
2026-03-23T18:02:26.879 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.880+0000 7f4883703640 0 -- 192.168.123.104:0/973193120 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x555625cb1b50 msgr2=0x555625ce36b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.880 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.883 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.884 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.77
2026-03-23T18:02:26.955 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:26.956+0000 7ff299f4b640 0 -- 192.168.123.104:0/523055335 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55fde74c6b50 msgr2=0x55fde74f8700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:02:26.955 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:26.959 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:26.959 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.78
2026-03-23T18:02:27.040 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:27.040+0000 7fd1a6237640 0 -- 192.168.123.104:0/472331559 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e24b92f880 msgr2=0x55e24b91f780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:27.044 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:27.049 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:27.049 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.79
2026-03-23T18:02:27.124 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:27.124+0000 7f2fcfc78640 0 -- 192.168.123.104:0/2242911278 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x561d97f57680 msgr2=0x561d97f77b00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:27.129 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:27.133 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:27.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.80
2026-03-23T18:02:27.210 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:27.208+0000 7f4b2e5af640 0 -- 192.168.123.104:0/3867445369 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x561e93c39f20 msgr2=0x561e93c6b640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:02:27.215 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:02:27.219 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:02:27.219 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.81
2026-03-23T18:02:27.284 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:02:27.284+0000 7f064b78b640 0 -- 192.168.123.104:0/828673374 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f062805c630 msgr2=0x7f062807ca30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:05:27.259 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:05:27.261+0000 7f064a502640 0 -- 192.168.123.104:0/828673374 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x5618a6365b50 msgr2=0x5618a6356940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.296 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.299 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.300 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.82
2026-03-23T18:17:27.360 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.357+0000 7fe78347d640 0 -- 192.168.123.104:0/449406298 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x559eb5721b50 msgr2=0x559eb57536e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.362 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.366 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.366 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.83
2026-03-23T18:17:27.433 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.429+0000 7fcf42e62640 0 -- 192.168.123.104:0/437499046 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x559532ae5f20 msgr2=0x559532b176d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.436 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.440 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.440 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.84
2026-03-23T18:17:27.501 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.497+0000 7fb5eadd6640 0 -- 192.168.123.104:0/3767689555 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fb5c80072e0 msgr2=0x7fb5c80276c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.707 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.710 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.710 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.85
2026-03-23T18:17:27.779 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.777+0000 7fdaf68e0640 0 -- 192.168.123.104:0/4275934132 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fdad0008d30 msgr2=0x7fdad00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.784 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.787 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.787 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.86
2026-03-23T18:17:27.854 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.849+0000 7f6c237fe640 0 -- 192.168.123.104:0/2837830614 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f6c04003060 msgr2=0x7f6c040012b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.857 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.861 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.861 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.87
2026-03-23T18:17:27.920 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.917+0000 7f091a606640 0 -- 192.168.123.104:0/1118198756 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55f49fa45820 msgr2=0x55f49fa79810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:27.923 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.927 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.927 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.88
2026-03-23T18:17:27.988 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:27.985+0000 7f4a45ab7640 0 -- 192.168.123.104:0/2941411398 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560cffb90f20 msgr2=0x560cffbc26d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:27.989 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:27.993 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:27.993 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.89
2026-03-23T18:17:28.053 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.049+0000 7fb8155f5640 0 -- 192.168.123.104:0/3282657653 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55b8bd6edb50 msgr2=0x55b8bd71f6b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.056 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.059 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.059 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.90
2026-03-23T18:17:28.125 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.121+0000 7fe67d5c4640 0 -- 192.168.123.104:0/1950005982 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5588547f4b50 msgr2=0x558854826650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.125 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.128 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.91
2026-03-23T18:17:28.196 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.193+0000 7f0c7bb52640 0 -- 192.168.123.104:0/1895615053 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a658845f20 msgr2=0x55a6588776d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.199 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.203 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.203 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.92
2026-03-23T18:17:28.265 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.261+0000 7f318203a640 0 -- 192.168.123.104:0/3866446538 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55fc23bd2b50 msgr2=0x55fc23c046b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.268 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.272 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.272 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.93
2026-03-23T18:17:28.336 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.333+0000 7f4829fe1640 0 -- 192.168.123.104:0/613636767 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558db5f87f20 msgr2=0x558db5fb9640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:28.338 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.342 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.342 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.94
2026-03-23T18:17:28.421 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.417+0000 7f9c65b6f640 0 -- 192.168.123.104:0/482726572 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5606d103af20 msgr2=0x5606d106c640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.421 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.424 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.424 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.95
2026-03-23T18:17:28.488 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.485+0000 7fe3a5649640 0 -- 192.168.123.104:0/527729229 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x559166330f20 msgr2=0x5591663626d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.489 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.493 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.493 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.96
2026-03-23T18:17:28.563 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.561+0000 7fa7603d5640 0 -- 192.168.123.104:0/2168465025 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55ffcd414b50 msgr2=0x55ffcd446700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.567 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.570 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.570 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.97
2026-03-23T18:17:28.636 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.633+0000 7f8d13c0d640 0 -- 192.168.123.104:0/497957695 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55f8b3fb0bf0 msgr2=0x55f8b3fdfa30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.636 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.639 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.639 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.98
2026-03-23T18:17:28.704 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:28.701+0000 7f27d51a1640 0 -- 192.168.123.104:0/3099880929 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d55a3a1b50 msgr2=0x55d55a3d3650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:28.707 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.710 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-23T18:17:28.710 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.99
2026-03-23T18:17:28.775 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:28.778 INFO:tasks.workunit.client.0.vm04.stderr:+ test_remove
2026-03-23T18:17:28.779 INFO:tasks.workunit.client.0.vm04.stdout:testing remove...
2026-03-23T18:17:28.779 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing remove...'
2026-03-23T18:17:28.779 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:17:28.779 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:28.839 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:28.902 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:28.963 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.023 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.086 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.148 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.210 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.278 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.340 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.404 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.463 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.525 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.586 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.647 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.705 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:29.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove NOT_EXIST
2026-03-23T18:17:30.132 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-23T18:17:30.132 INFO:tasks.workunit.client.0.vm04.stderr:rbd: delete error: (2) No such file or directory
2026-03-23T18:17:30.136 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:30.136 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-23T18:17:30.151 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:30.157 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.153+0000 7fde27265200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:30.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:17:30.193 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:30.196 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:30.196 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:30.196 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:30.218 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:30.218 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T18:17:30.252 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:30.321 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.317+0000 7fa14540b640 0 -- 192.168.123.104:0/574013799 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55e1d277eb50 msgr2=0x55e1d27b0700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:30.325 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:30.329 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:30.329 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:30.329 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:30.553 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:30.553 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-23T18:17:30.569 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:30.575 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.573+0000 7fe1d4fac200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:30.582 INFO:tasks.workunit.client.0.vm04.stderr:+ rados rm -p rbd test1.rbd
2026-03-23T18:17:30.607 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:17:30.632 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:30.636 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:30.636 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:30.636 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:30.659 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:30.659 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -eq 0 ']'
2026-03-23T18:17:30.659 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T18:17:30.694 INFO:tasks.workunit.client.0.vm04.stderr:++ grep '^rbd_header'
2026-03-23T18:17:30.694 INFO:tasks.workunit.client.0.vm04.stderr:++ rados -p rbd ls
2026-03-23T18:17:30.720 INFO:tasks.workunit.client.0.vm04.stderr:+ HEADER=rbd_header.18cbe4dd69d8
2026-03-23T18:17:30.720 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_header.18cbe4dd69d8
2026-03-23T18:17:30.744 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:30.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.765+0000 7f1212ffd640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-23T18:17:30.771 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.769+0000 7f1212ffd640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-23T18:17:30.780 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.777+0000 7f12137fe640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-23T18:17:30.789 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:30.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:30.793 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:30.793 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:30.818 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:30.818 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T18:17:30.851 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_id.test2
2026-03-23T18:17:30.873 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:30.934 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:30.929+0000 7fe8ec8b7640 0 -- 192.168.123.104:0/2864974571 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55c3ade3d520 msgr2=0x55c3ade5d9a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:30.937 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:30.941 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:30.941 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:30.941 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:30.964 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:30.964 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T18:17:30.998 INFO:tasks.workunit.client.0.vm04.stderr:++ rados -p rbd ls
2026-03-23T18:17:30.998 INFO:tasks.workunit.client.0.vm04.stderr:++ grep '^rbd_header'
2026-03-23T18:17:31.023 INFO:tasks.workunit.client.0.vm04.stderr:+ HEADER=rbd_header.18e93473e07d
2026-03-23T18:17:31.023 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_header.18e93473e07d
2026-03-23T18:17:31.045 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_id.test2
2026-03-23T18:17:31.067 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:31.089 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:31.085+0000 7f7f13fff640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-23T18:17:31.090 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:31.085+0000 7f7f13fff640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-23T18:17:31.099 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:31.097+0000 7f7f20a29640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-23T18:17:31.105 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:31.109 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:31.109 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:31.109 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:31.131 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:31.131 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T18:17:31.165 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test2@snap
2026-03-23T18:17:32.076 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:17:32.083 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test2@snap
2026-03-23T18:17:32.114 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test2@snap clone --rbd-default-clone-format 1
2026-03-23T18:17:32.159 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_children
2026-03-23T18:17:32.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone
2026-03-23T18:17:32.243 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:32.241+0000 7f81bfccd640 0 -- 192.168.123.104:0/4046545661 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5641e0321f20 msgr2=0x5641e0353700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:32.250 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:32.254 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:32.254 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone
2026-03-23T18:17:32.254 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:32.254 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:32.277 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:32.277 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test2@snap
2026-03-23T18:17:32.311 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test2@snap
2026-03-23T18:17:33.083 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:17:33.090 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:33.149 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:33.145+0000 7f46d12bb640 0 -- 192.168.123.104:0/1482441628 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f46b005c740 msgr2=0x7f46b007cb40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:33.159 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:33.164 INFO:tasks.workunit.client.0.vm04.stdout:testing migration...
2026-03-23T18:17:33.164 INFO:tasks.workunit.client.0.vm04.stderr:+ test_migration
2026-03-23T18:17:33.164 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing migration...'
2026-03-23T18:17:33.164 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:17:33.164 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.234 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.297 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.361 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.426 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.490 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.559 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.622 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.687 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.751 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.811 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.874 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.936 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:33.997 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:34.059 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:34.121 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:34.191 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:34.253 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:34.318 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T18:17:35.188 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T18:17:35.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T18:17:38.156 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 128M test1
2026-03-23T18:17:38.171 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:38.177 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:38.174+0000 7ff723d43200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:38.184 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:17:38.185 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'format: 1'
2026-03-23T18:17:38.209 INFO:tasks.workunit.client.0.vm04.stdout: format: 1
2026-03-23T18:17:38.209 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 --image-format 2
2026-03-23T18:17:38.258 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state test1
2026-03-23T18:17:38.258 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=test1
2026-03-23T18:17:38.258 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status test1
2026-03-23T18:17:38.258 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:38.312 INFO:tasks.workunit.client.0.vm04.stderr:+ test prepared = prepared
2026-03-23T18:17:38.312 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:17:38.312 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'format: 2'
2026-03-23T18:17:38.341 INFO:tasks.workunit.client.0.vm04.stdout: format: 2
2026-03-23T18:17:38.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:17:38.367 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:38.366+0000 7fc9a0b52200 -1 librbd::image::PreRemoveRequest: 0x56212ba3d200 validate_image_removal: image in migration state - not removing
2026-03-23T18:17:38.368 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-23T18:17:38.368 INFO:tasks.workunit.client.0.vm04.stderr:rbd: error: image still has watchers
2026-03-23T18:17:38.368 INFO:tasks.workunit.client.0.vm04.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-23T18:17:38.371 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:38.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-23T18:17:38.428 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:38.426+0000 7fe840c27640 0 -- 192.168.123.104:0/1254809470 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x559fb408f5b0 msgr2=0x559fb411f180 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.432 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:38.430+0000 7fe840c27640 0 -- 192.168.123.104:0/1254809470 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fe82005cfb0 msgr2=0x7fe82007d3b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.436 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:38.440 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state test1
2026-03-23T18:17:38.440 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=test1
2026-03-23T18:17:38.440 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status test1
2026-03-23T18:17:38.440 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:38.486 INFO:tasks.workunit.client.0.vm04.stderr:+ test executed = executed
2026-03-23T18:17:38.486 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-23T18:17:38.542 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete...2026-03-23T18:17:38.538+0000 7f134451b640 0 -- 192.168.123.104:0/1728512560 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556bfce163d0 msgr2=0x556bfcf57370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.548 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete...2026-03-23T18:17:38.546+0000 7f1343292640 0 -- 192.168.123.104:0/1728512560 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f131c008d30 msgr2=0x7f131c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.555 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-23T18:17:38.559 INFO:tasks.workunit.client.0.vm04.stderr:+ get_migration_state test1
2026-03-23T18:17:38.559 INFO:tasks.workunit.client.0.vm04.stderr:+ local image=test1
2026-03-23T18:17:38.559 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --format xml status test1
2026-03-23T18:17:38.559 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:38.585 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:38.586 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:17:38.586 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features: .*layering'
2026-03-23T18:17:38.612 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:38.612 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 --image-feature layering,exclusive-lock,object-map,fast-diff,deep-flatten
2026-03-23T18:17:38.676 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:17:38.676 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features: .*layering'
2026-03-23T18:17:38.707 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, data-pool, migrating
2026-03-23T18:17:38.707 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-23T18:17:38.752 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:38.750+0000 7f7b5edac640 0 -- 192.168.123.104:0/2194587581 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f7b3c004a30 msgr2=0x7f7b3c024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:38.759 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:38.758+0000 7f7b60035640 0 -- 192.168.123.104:0/2194587581 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x561d52bd85b0 msgr2=0x561d52c6abc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.761 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:38.765 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-23T18:17:38.819 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:38.818+0000 7faa99a75640 0 -- 192.168.123.104:0/339714046 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5598350ff8c0 msgr2=0x559835195b20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.826 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete...2026-03-23T18:17:38.822+0000 7faa93fff640 0 -- 192.168.123.104:0/339714046 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7faa74004a30 msgr2=0x7faa74024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.844 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-23T18:17:38.848 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 rbd2/test1
2026-03-23T18:17:38.910 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:38.906+0000 7f793fcf4640 0 -- 192.168.123.104:0/1938941726 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f791c004a30 msgr2=0x7f791c024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:38.917 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/test1
2026-03-23T18:17:38.917 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/test1
2026-03-23T18:17:38.917 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/test1
2026-03-23T18:17:38.917 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:38.969 INFO:tasks.workunit.client.0.vm04.stderr:+ test prepared = prepared
2026-03-23T18:17:38.969 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:17:38.969 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:17:38.969 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:17:38.992 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:17:38.992 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls
2026-03-23T18:17:38.992 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:17:39.015 INFO:tasks.workunit.client.0.vm04.stdout:test1
2026-03-23T18:17:39.015 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-23T18:17:39.065 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:39.062+0000 7f897f452640 0 -- 192.168.123.104:0/3014685261 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f895c004cf0 msgr2=0x7f895c0250d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.068 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:39.071 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/test1
2026-03-23T18:17:39.071 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/test1
2026-03-23T18:17:39.071 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/test1
2026-03-23T18:17:39.072 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:39.117 INFO:tasks.workunit.client.0.vm04.stderr:+ test executed = executed
2026-03-23T18:17:39.117 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd2/test1
2026-03-23T18:17:39.146 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:39.142+0000 7f94677fe640 -1 librbd::image::PreRemoveRequest: 0x559f7b739750 validate_image_removal: image in migration state - not removing
2026-03-23T18:17:39.150 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-23T18:17:39.150 INFO:tasks.workunit.client.0.vm04.stderr:rbd: error: image still has watchers
2026-03-23T18:17:39.150 INFO:tasks.workunit.client.0.vm04.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-23T18:17:39.154 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:39.154 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-23T18:17:39.223 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:39.222+0000 7efff42c8640 0 -- 192.168.123.104:0/2655376468 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x564b6dd2b8c0 msgr2=0x564b6ddc1b60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.236 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-23T18:17:39.241 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns1
2026-03-23T18:17:39.270 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns2
2026-03-23T18:17:39.297 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare rbd2/test1 rbd2/ns1/test1
2026-03-23T18:17:39.355 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:39.354+0000 7f8d7fbd3640 0 -- 192.168.123.104:0/494701647 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f8d6005cdb0 msgr2=0x7f8d6007d1b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.562 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-23T18:17:39.562 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/ns1/test1
2026-03-23T18:17:39.562 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-23T18:17:39.562 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:39.611 INFO:tasks.workunit.client.0.vm04.stderr:+ test prepared = prepared
2026-03-23T18:17:39.611 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute rbd2/test1
2026-03-23T18:17:39.657 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:39.654+0000 7f43db5c0640 0 -- 192.168.123.104:0/751939944 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x557695c195b0 msgr2=0x557695cab520 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.658 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:39.662 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-23T18:17:39.662 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/ns1/test1
2026-03-23T18:17:39.662 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-23T18:17:39.662 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-23T18:17:39.704 INFO:tasks.workunit.client.0.vm04.stderr:+ test executed = executed
2026-03-23T18:17:39.704 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit rbd2/test1
2026-03-23T18:17:39.753 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-23T18:17:39.750+0000 7f4852a55640 0 -- 192.168.123.104:0/918970190 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x564df7dadc90 msgr2=0x564df7e43330 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.759 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done.
2026-03-23T18:17:39.763 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare rbd2/ns1/test1 rbd2/ns2/test1
2026-03-23T18:17:39.824 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:39.822+0000 7fb53dce6640 0 -- 192.168.123.104:0/3182115226 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55ae5f7e30c0 msgr2=0x55ae5f8c58a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute rbd2/ns2/test1
2026-03-23T18:17:39.875 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:39.874+0000 7f3d5bfff640 0 -- 192.168.123.104:0/2002844390 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x560a4e4f5b20 msgr2=0x560a4e515f00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:39.882 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:39.886 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit rbd2/ns2/test1
2026-03-23T18:17:39.939 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:39.938+0000 7f84e7bbd640 0 -- 192.168.123.104:0/611854876 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55bb1d46a880 msgr2=0x55bb1d48ac60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:39.944 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:39.942+0000 7f84e7bbd640 0 -- 192.168.123.104:0/611854876 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f84c805cf50 msgr2=0x7f84c807d350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:39.952 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-23T18:17:39.956 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M test1
2026-03-23T18:17:39.990 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 --data-pool rbd2
2026-03-23T18:17:40.064 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.062+0000 7f11e3fff640 0 -- 192.168.123.104:0/3563314968 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f11c4025a40 msgr2=0x7f11c404a6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:40.071 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:17:40.071 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'data_pool: rbd2'
2026-03-23T18:17:40.101 INFO:tasks.workunit.client.0.vm04.stdout: data_pool: rbd2
2026-03-23T18:17:40.101 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-23T18:17:40.161 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:40.158+0000 7fa438fcc640 0 -- 192.168.123.104:0/3233212469 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fa410008d30 msgr2=0x7fa4100291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:40.175 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.174+0000 7fa438fcc640 0 -- 192.168.123.104:0/3233212469 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fa41805cd30 msgr2=0x7fa41807d130 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:40.177 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:40.181 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-23T18:17:40.235 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.234+0000 7f8719167640 0 -- 192.168.123.104:0/4035444256 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f86f0008d30 msgr2=0x7f86f00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:40.241 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.238+0000 7f871a3f0640 0 -- 192.168.123.104:0/4035444256 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5633372898c0 msgr2=0x56333731faa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:40.252 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-23T18:17:40.256 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1
2026-03-23T18:17:40.324 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.322+0000 7f75da03a640 0 -- 192.168.123.104:0/3407402859 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a7f7ab7aa0 msgr2=0x55a7f7bf8f20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:40.331 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash mv test1
2026-03-23T18:17:40.331 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test1
2026-03-23T18:17:40.362 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.358+0000 7f9da6664200 -1 librbd::api::Trash: move: cannot move migrating image to trash
2026-03-23T18:17:40.365 INFO:tasks.workunit.client.0.vm04.stderr:rbd: deferred delete error: (16) Device or resource busy
2026-03-23T18:17:40.368 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:40.369 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls -a
2026-03-23T18:17:40.369 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-23T18:17:40.392 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=19efd8065d90
2026-03-23T18:17:40.392 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash rm 19efd8065d90
2026-03-23T18:17:40.392 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 19efd8065d90
2026-03-23T18:17:40.419 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.418+0000 7fd43affd640 -1 librbd::image::RefreshRequest: image being migrated
2026-03-23T18:17:40.419 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.418+0000 7fd43affd640 -1 librbd::image::OpenRequest: failed to refresh image: (30) Read-only file system
2026-03-23T18:17:40.419 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.418+0000 7fd43affd640 -1 librbd::ImageState: 0x7fd41c03c140 failed to open image: (30) Read-only file system
2026-03-23T18:17:40.419 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.418+0000 7fd4257fa640 -1 librbd::image::RemoveRequest: 0x7fd41c000b90 handle_open_image: error opening image: (30) Read-only file system
2026-03-23T18:17:40.428 INFO:tasks.workunit.client.0.vm04.stderr:rbd: remove error: (30) Read-only file system
2026-03-23T18:17:40.429 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-23T18:17:40.432 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:40.432 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash restore 19efd8065d90
2026-03-23T18:17:40.432 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash restore 19efd8065d90
2026-03-23T18:17:40.453 INFO:tasks.workunit.client.0.vm04.stderr:rbd: restore error: 2026-03-23T18:17:40.450+0000 7f508cc24200 -1 librbd::api::Trash: restore: Current trash source 'migration' does not match expected: user,mirroring,unknown (4)
2026-03-23T18:17:40.453 INFO:tasks.workunit.client.0.vm04.stderr:(22) Invalid argument
2026-03-23T18:17:40.456 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:40.456 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test1
2026-03-23T18:17:40.520 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-23T18:17:40.518+0000 7f56b6a3f640 0 -- 192.168.123.104:0/3374162916 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f5690006f70 msgr2=0x7f5690005300 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:40.522 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:40.525 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove test1
2026-03-23T18:17:40.598 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:40.594+0000 7f8c40c2e640 0 -- 192.168.123.104:0/3328398035 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x564d8d8887c0 msgr2=0x564d8d8a8c40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:40.602 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:40.606 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/dev/urandom bs=1M count=1
2026-03-23T18:17:40.606 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image-format 2 import - test1
2026-03-23T18:17:40.827 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:40.826+0000 7f9aa577d640 0 --2- 192.168.123.104:0/62005366 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x55a8e845f0f0 0x55a8e84518d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-23T18:17:40.844 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-23T18:17:40.844 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-23T18:17:40.845 INFO:tasks.workunit.client.0.vm04.stderr:1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.236878 s, 4.4 MB/s
2026-03-23T18:17:40.867 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-23T18:17:40.872 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export test1 -
2026-03-23T18:17:40.872 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:40.903 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:40.907 INFO:tasks.workunit.client.0.vm04.stderr:+ md5sum='d45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:40.907 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap1
2026-03-23T18:17:41.175 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:17:41.181 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@snap1
2026-03-23T18:17:41.211 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap2
2026-03-23T18:17:42.180 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:17:42.187 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@snap1 clone_v1 --rbd_default_clone_format=1
2026-03-23T18:17:42.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@snap2 clone_v2 --rbd_default_clone_format=2
2026-03-23T18:17:42.290 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v1
2026-03-23T18:17:42.290 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap1'
2026-03-23T18:17:42.320 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap1
2026-03-23T18:17:42.321 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-23T18:17:42.321 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap2'
2026-03-23T18:17:42.351 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap2
2026-03-23T18:17:42.352 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-23T18:17:42.352 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'op_features: clone-child'
2026-03-23T18:17:42.385 INFO:tasks.workunit.client.0.vm04.stdout: op_features: clone-child
2026-03-23T18:17:42.385 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v1 -
2026-03-23T18:17:42.385 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:42.417 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:42.421 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'd45c7b4b4d2f9776943df70f262ec422 -' = 'd45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:42.421 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v2 -
2026-03-23T18:17:42.422 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:42.452 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:42.456 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'd45c7b4b4d2f9776943df70f262ec422 -' = 'd45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:42.456 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap1
2026-03-23T18:17:42.491 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-23T18:17:42.491 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap2
2026-03-23T18:17:42.521 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-23T18:17:42.521 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 rbd2/test2
2026-03-23T18:17:42.583 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:42.582+0000 7fbeb4a28640 0 -- 192.168.123.104:0/3635470065 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fbe90004a30 msgr2=0x7fbe90024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:42.589 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:42.586+0000 7fbeb4a28640 0 -- 192.168.123.104:0/3635470065 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x561e10285690 msgr2=0x7fbe9407d740 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:44.226 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v1
2026-03-23T18:17:44.226 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd2/test2@snap1'
2026-03-23T18:17:44.271 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd2/test2@snap1
2026-03-23T18:17:44.272 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-23T18:17:44.272 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd2/test2@snap2'
2026-03-23T18:17:44.317 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd2/test2@snap2
2026-03-23T18:17:44.317 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-23T18:17:44.318 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'op_features: clone-child'
2026-03-23T18:17:44.363 INFO:tasks.workunit.client.0.vm04.stdout: op_features: clone-child
2026-03-23T18:17:44.364 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children rbd2/test2@snap1
2026-03-23T18:17:44.410 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-23T18:17:44.410 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children rbd2/test2@snap2
2026-03-23T18:17:44.453 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-23T18:17:44.453 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-23T18:17:44.632 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:44.639 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd migration commit test1
2026-03-23T18:17:44.639 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-23T18:17:44.743 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants:
2026-03-23T18:17:44.743 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1
2026-03-23T18:17:44.743 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2
2026-03-23T18:17:44.743 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-23T18:17:44.743 INFO:tasks.workunit.client.0.vm04.stderr:Ensure no descendant images are opened read-only and run again with force flag.
2026-03-23T18:17:44.747 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:44.747 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 --force
2026-03-23T18:17:44.863 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants:
2026-03-23T18:17:44.863 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1
2026-03-23T18:17:44.863 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2
2026-03-23T18:17:44.863 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-23T18:17:44.863 INFO:tasks.workunit.client.0.vm04.stderr:Proceeding anyway due to force flag set.
2026-03-23T18:17:44.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:44.866+0000 7fbded9f6640 0 -- 192.168.123.104:0/4269679948 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55a83d3fa7c0 msgr2=0x55a83d48d2d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:44.879 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:44.878+0000 7fbde7fff640 0 -- 192.168.123.104:0/4269679948 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fbdc4004cf0 msgr2=0x7fbdc40250d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:46.081 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done.
2026-03-23T18:17:46.086 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v1 -
2026-03-23T18:17:46.086 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:46.121 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:46.126 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'd45c7b4b4d2f9776943df70f262ec422 -' = 'd45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:46.126 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v2 -
2026-03-23T18:17:46.126 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:46.157 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:46.161 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'd45c7b4b4d2f9776943df70f262ec422 -' = 'd45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:46.161 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare rbd2/test2 test1
2026-03-23T18:17:46.330 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:46.326+0000 7fefca26e640 0 -- 192.168.123.104:0/2431434453 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x556088444690 msgr2=0x5560884d3eb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:47.190 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:47.186+0000 7fefca26e640 0 -- 192.168.123.104:0/2431434453 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fefa805ce00 msgr2=0x7fefa807d200 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:48.376 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v1
2026-03-23T18:17:48.376 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap1'
2026-03-23T18:17:48.430 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap1
2026-03-23T18:17:48.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-23T18:17:48.430 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap2'
2026-03-23T18:17:48.487 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap2
2026-03-23T18:17:48.488 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-23T18:17:48.488 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'op_features: clone-child'
2026-03-23T18:17:48.534 INFO:tasks.workunit.client.0.vm04.stdout: op_features: clone-child
2026-03-23T18:17:48.534 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap1
2026-03-23T18:17:48.593 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-23T18:17:48.594 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap2
2026-03-23T18:17:48.637 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-23T18:17:48.637 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-23T18:17:48.803 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:48.802+0000 7f4391c29640 0 -- 192.168.123.104:0/1183031726 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f4374008d30 msgr2=0x7f43740291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:48.804 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:48.808 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd migration commit test1
2026-03-23T18:17:48.808 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-23T18:17:48.911 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:48.910+0000 7f8ff215b640 0 -- 192.168.123.104:0/2999439007 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55c3539f2c90 msgr2=0x55c353a883d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:48.911 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants:
2026-03-23T18:17:48.911 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1
2026-03-23T18:17:48.911 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2
2026-03-23T18:17:48.911 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-23T18:17:48.911 INFO:tasks.workunit.client.0.vm04.stderr:Ensure no descendant images are opened read-only and run again with force flag.
2026-03-23T18:17:48.916 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:48.916 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 --force
2026-03-23T18:17:48.979 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:48.978+0000 7fb74587b640 0 -- 192.168.123.104:0/521380244 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fb720008d30 msgr2=0x7fb7200291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:48.983 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants:
2026-03-23T18:17:48.987 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1
2026-03-23T18:17:48.988 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2
2026-03-23T18:17:48.988 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-23T18:17:48.988 INFO:tasks.workunit.client.0.vm04.stderr:Proceeding anyway due to force flag set.
2026-03-23T18:17:48.988 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:48.982+0000 7fb74587b640 0 -- 192.168.123.104:0/521380244 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fb72c07d7a0 msgr2=0x7fb72c09dba0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:50.251 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done.
2026-03-23T18:17:50.256 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v1 -
2026-03-23T18:17:50.256 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:50.290 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:50.295 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'd45c7b4b4d2f9776943df70f262ec422 -' = 'd45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:50.295 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v2 -
2026-03-23T18:17:50.295 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-23T18:17:50.328 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:17:50.332 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'd45c7b4b4d2f9776943df70f262ec422 -' = 'd45c7b4b4d2f9776943df70f262ec422 -'
2026-03-23T18:17:50.333 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove clone_v1
2026-03-23T18:17:50.399 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:50.398+0000 7f48da4d5640 0 -- 192.168.123.104:0/2629526421 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55dd299a7b50 msgr2=0x55dd299d9650 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:50.404 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:50.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove clone_v2
2026-03-23T18:17:50.481 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:50.478+0000 7f7ba6894640 0 -- 192.168.123.104:0/663128855 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x56246cd6ff20 msgr2=0x56246cda16d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:50.491 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:50.495 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test1@snap1
2026-03-23T18:17:50.532 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge test1
2026-03-23T18:17:52.245 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-23T18:17:52.253 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:17:52.321 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:52.318+0000 7f7f79d48640 0 -- 192.168.123.104:0/3450539510 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x555ac718cb50 msgr2=0x555ac71be700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:52.323 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:52.326 INFO:tasks.workunit.client.0.vm04.stderr:+ for format in 1 2
2026-03-23T18:17:52.326 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-23T18:17:52.341 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:52.347 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:52.346+0000 7ff6c56b6200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:52.354 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-23T18:17:52.399 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:52.427 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:52.429 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:52.429 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:52.434 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:52.480 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... [progress updates 3–96% elided] Abort image migration: 98% complete...2026-03-23T18:17:52.478+0000 7ff34d9f6640 0 -- 192.168.123.104:0/3567997678 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7ff330008d30 msgr2=0x7ff3300291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:52.493 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:52.497 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:52.522 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:52.524 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:52.524 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:52.528 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:52.769 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... [progress updates 6–96% elided] Removing image: 100% complete...done.
2026-03-23T18:17:52.773 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-23T18:17:52.788 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:52.794 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:52.790+0000 7f2332199200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:52.801 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-23T18:17:52.850 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:52.879 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:52.881 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:52.881 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:52.886 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test2
2026-03-23T18:17:52.937 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... [progress updates 6–93% elided] Image migration: 96% complete...2026-03-23T18:17:52.934+0000 7f7d51471640 0 -- 192.168.123.104:0/2970389503 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f7d28008d30 msgr2=0x7f7d280291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:52.940 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:52.938+0000 7f7d526fa640 0 -- 192.168.123.104:0/2970389503 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x56114482d5b0 msgr2=0x5611448bf4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:52.945 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:52.948 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:52.996 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... [progress updates 3–96% elided] Abort image migration: 98% complete...2026-03-23T18:17:52.994+0000 7f420beb4640 0 -- 192.168.123.104:0/4013216717 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f41e4008d30 msgr2=0x7f41e40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:53.009 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:53.012 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:53.038 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.040 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.040 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:53.044 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:53.084 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... [progress updates 6–96% elided] Removing image: 100% complete...done.
2026-03-23T18:17:53.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-23T18:17:53.102 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:53.108 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:53.106+0000 7f6c8b331200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:53.115 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-23T18:17:53.145 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:53.142+0000 7ffa99653200 -1 librbd::image::CreateRequest: 0x56423b917740 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-23T18:17:53.145 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:53.142+0000 7ffa99653200 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-23T18:17:53.150 INFO:tasks.workunit.client.0.vm04.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-23T18:17:53.153 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:53.153 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:53.177 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.180 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.180 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:53.184 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:53.225 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-23T18:17:53.228 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-23T18:17:53.243 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:53.248 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:53.246+0000 7fbe03b7e200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:53.255 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-23T18:17:53.299 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-23T18:17:53.327 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.330 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.330 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:53.336 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:53.389 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-23T18:17:53.386+0000 7f819914c640 0 -- 192.168.123.104:0/4133425412 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x556d84eea5b0 msgr2=0x556d84f7c620 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:53.391 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:53.394 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:53.418 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.421 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.421 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:53.425 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:53.461 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-23T18:17:53.465 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-23T18:17:53.479 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-23T18:17:53.485 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:53.482+0000 7f33456cd200 -1 librbd: Forced V1 image creation.
2026-03-23T18:17:53.491 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-23T18:17:53.535 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort rbd2/test2
2026-03-23T18:17:53.586 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-23T18:17:53.582+0000 7f59ce751640 0 -- 192.168.123.104:0/640411642 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f59b0008d30 msgr2=0x7f59b00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:53.594 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:53.598 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:53.623 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.625 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.625 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:53.629 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:53.667 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-23T18:17:53.670 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1
2026-03-23T18:17:53.670 INFO:tasks.workunit.client.0.vm04.stderr:+ continue
2026-03-23T18:17:53.670 INFO:tasks.workunit.client.0.vm04.stderr:+ for format in 1 2
2026-03-23T18:17:53.670 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-23T18:17:53.703 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-23T18:17:53.766 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:53.762+0000 7fc829230640 0 -- 192.168.123.104:0/352418110 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x5653f8b4a340 msgr2=0x5653f8bdc1d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:53.772 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:53.804 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.807 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.807 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:53.813 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:53.877 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-23T18:17:53.874+0000 7f9a1ab87640 0 -- 192.168.123.104:0/2867735129 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560da684f8d0 msgr2=0x560da686fcb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:53.882 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:53.885 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:53.917 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:53.919 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:53.919 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:53.928 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:53.998 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:53.994+0000 7f4502c32640 0 -- 192.168.123.104:0/1202633744 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x562c06157b50 msgr2=0x562c061896b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:54.004 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:54.009 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-23T18:17:54.045 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-23T18:17:54.114 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:54.110+0000 7f2846c9e640 0 -- 192.168.123.104:0/63015538 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x55d731e303b0 msgr2=0x55d731e9ef50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:54.121 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:54.157 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:54.160 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:54.160 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:54.168 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test2
2026-03-23T18:17:54.221 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-23T18:17:54.218+0000 7fe3a2ba6640 0 -- 192.168.123.104:0/243614568 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fe38405ce60 msgr2=0x7fe38407d260 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:54.229 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:54.226+0000 7fe3a3e2f640 0 -- 192.168.123.104:0/243614568 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x559e02ef75b0 msgr2=0x559e02f8a110 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:54.232 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-23T18:17:54.235 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:54.310 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-23T18:17:54.310+0000 7fa4affff640 0 -- 192.168.123.104:0/1883962351 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560c07cef150 msgr2=0x560c07d0f530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:54.314 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-23T18:17:54.319 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:54.348 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:54.351 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:54.351 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:54.358 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:54.457 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:54.454+0000 7f0f01c5f640 0 -- 192.168.123.104:0/2471131825 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x5559d8df0790 msgr2=0x5559d8e10c10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:54.463 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:54.468 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-23T18:17:54.505 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-23T18:17:54.553 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:54.550+0000 7fc45fcc6200 -1 librbd::image::CreateRequest: 0x56315f2c5740 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-23T18:17:54.553 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:54.550+0000 7fc45fcc6200 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-23T18:17:54.570 INFO:tasks.workunit.client.0.vm04.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-23T18:17:54.574 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:17:54.574 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:54.607 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:54.610 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:54.610 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:54.617 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:54.687 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:54.686+0000 7ff78bfff640 0 -- 192.168.123.104:0/92251764 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7ff778008d30 msgr2=0x7ff7780291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:54.694 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:54.698 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-23T18:17:54.731 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-23T18:17:54.796 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:54.794+0000 7ff5dbfff640 0 -- 192.168.123.104:0/1682221703 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7ff5b800fec0 msgr2=0x7ff5b8010ed0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:17:54.807 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-23T18:17:54.843 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:54.846 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:54.846 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-23T18:17:54.853 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:54.924 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete... Abort image migration: 100% complete...done.
2026-03-23T18:17:54.927 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:54.958 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:54.960 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:54.960 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:54.967 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:55.050 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:55.046+0000 7fa4bf12b640 0 -- 192.168.123.104:0/3657130842 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fa498008d30 msgr2=0x7fa4980291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:55.062 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:55.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-23T18:17:55.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-23T18:17:55.165 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:55.162+0000 7f1f6d25e640 0 -- 192.168.123.104:0/3611824281 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f1f50031e60 msgr2=0x7f1f50032270 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:55.173 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort rbd2/test2
2026-03-23T18:17:55.238 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete... Abort image migration: 100% complete...done.
2026-03-23T18:17:55.242 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:55.272 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:55.276 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:55.276 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:55.286 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:55.353 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:55.350+0000 7ff488a47640 0 -- 192.168.123.104:0/2612952052 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55994b9c6640 msgr2=0x55994b9e6ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:55.359 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:17:55.362 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 = 1
2026-03-23T18:17:55.362 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-23T18:17:55.397 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/ns1/test3
2026-03-23T18:17:55.460 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:17:55.458+0000 7fc7b5bc2640 0 -- 192.168.123.104:0/3034919669 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55d3eb9291f0 msgr2=0x55d3eba0d9d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:55.467 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/ns1/test3
2026-03-23T18:17:55.499 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:55.502 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:55.503 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:55.509 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-23T18:17:55.574 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete... Abort image migration: 100% complete...done.
2026-03-23T18:17:55.578 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-23T18:17:55.622 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-23T18:17:55.625 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:17:55.625 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-23T18:17:55.633 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-23T18:17:55.696 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:17:55.694+0000 7fadb7fff640 0 -- 192.168.123.104:0/2979268508 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x56093503e800 msgr2=0x56093505ec80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:17:55.702 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
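The abort cycles traced above all follow one pattern: create an image, prepare a migration to a destination pool (or namespace), abort it, and then verify with `rbd bench` that the source image is writable again. A minimal sketch of that cycle, under the assumption of the image and pool names from the trace; these commands need a live Ceph cluster, so `run` only echoes them here to keep the sketch self-contained:

```shell
# Sketch of the migration-abort cycle exercised in the log above.
# "run" echoes instead of executing; replace its body with "$@" on a real cluster.
run() { echo "+ $*"; }

run rbd create -s 128M --image-format 2 test2
run rbd migration prepare test2 rbd2/test2   # test2 becomes a migration source
run rbd migration abort test2                # roll back; test2 is usable again
run rbd bench --io-type write --io-size 1024 --io-total 1024 test2
run rbd rm test2
```

The key property the test checks is that after `migration abort`, I/O against the original image name succeeds again.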
2026-03-23T18:17:55.706 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:17:55.706 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:55.769 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:55.833 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:55.893 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:55.956 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.236 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.505 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.570 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.634 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.703 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.766 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.828 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.887 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:56.945 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:57.009 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:57.078 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:57.147 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:57.208 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:57.267 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T18:17:58.112 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T18:17:58.125 INFO:tasks.workunit.client.0.vm04.stderr:+ test_config
2026-03-23T18:17:58.126 INFO:tasks.workunit.client.0.vm04.stdout:testing config...
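In the config checks that follow, each invalid `rbd config` call is wrapped in `expect_fail`, which is why every error line is followed by `+ return 0`. The helper name comes straight from the trace; its body below is an assumed minimal sketch, not necessarily the workunit's exact implementation:

```shell
# expect_fail-style helper: succeed only when the wrapped command fails.
# The name matches the xtrace above; this body is an assumed sketch.
expect_fail() {
    if "$@"; then
        return 1   # command unexpectedly succeeded -> the test should fail
    fi
    return 0       # command failed, as expected
}

expect_fail false && echo "caught expected failure"
```

This inversion lets the suite run with `set -e` while still asserting that specific commands error out.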
2026-03-23T18:17:58.126 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing config...'
2026-03-23T18:17:58.126 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:17:58.126 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.204 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.289 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.358 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.417 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.478 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.539 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.601 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.664 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.726 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.847 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.907 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:58.969 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:59.063 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:59.206 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:59.284 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:59.345 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:17:59.406 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set osd rbd_cache true
2026-03-23T18:17:59.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set osd rbd_cache true
2026-03-23T18:17:59.421 INFO:tasks.workunit.client.0.vm04.stderr:rbd: invalid config entity: osd (must be global, client or client.<id>)
2026-03-23T18:17:59.423 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:59.423 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set global debug_ms 10
2026-03-23T18:17:59.423 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global debug_ms 10
2026-03-23T18:17:59.438 INFO:tasks.workunit.client.0.vm04.stderr:rbd: not rbd option: debug_ms
2026-03-23T18:17:59.439 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:59.439 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set global rbd_UNKNOWN false
2026-03-23T18:17:59.439 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global rbd_UNKNOWN false
2026-03-23T18:17:59.454 INFO:tasks.workunit.client.0.vm04.stderr:rbd: invalid config key: rbd_UNKNOWN
2026-03-23T18:17:59.456 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:59.456 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set global rbd_cache INVALID
2026-03-23T18:17:59.456 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global rbd_cache INVALID
2026-03-23T18:17:59.476 INFO:tasks.workunit.client.0.vm04.stderr:rbd: error setting rbd_cache: error parsing value: Expected option value to be integer, got 'INVALID'
2026-03-23T18:17:59.479 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:59.479 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global rbd_cache false
2026-03-23T18:17:59.509 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set client rbd_cache true
2026-03-23T18:17:59.536 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set client.123 rbd_cache false
2026-03-23T18:17:59.563 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get global rbd_cache
2026-03-23T18:17:59.563 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^false$'
2026-03-23T18:17:59.589 INFO:tasks.workunit.client.0.vm04.stdout:false
2026-03-23T18:17:59.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client rbd_cache
2026-03-23T18:17:59.590 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^true$'
2026-03-23T18:17:59.614 INFO:tasks.workunit.client.0.vm04.stdout:true
2026-03-23T18:17:59.614 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client.123 rbd_cache
2026-03-23T18:17:59.614 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^false$'
2026-03-23T18:17:59.638 INFO:tasks.workunit.client.0.vm04.stdout:false
2026-03-23T18:17:59.638 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global get client.UNKNOWN rbd_cache
2026-03-23T18:17:59.638 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client.UNKNOWN rbd_cache
2026-03-23T18:17:59.662 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-23T18:17:59.666 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:59.666 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list global
2026-03-23T18:17:59.666 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * global *$'
2026-03-23T18:17:59.689 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false global
2026-03-23T18:17:59.689 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client
2026-03-23T18:17:59.689 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * client *$'
2026-03-23T18:17:59.713 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true client
2026-03-23T18:17:59.713 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client.123
2026-03-23T18:17:59.713 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * client.123 *$'
2026-03-23T18:17:59.737 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false client.123
2026-03-23T18:17:59.737 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client.UNKNOWN
2026-03-23T18:17:59.737 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * client *$'
2026-03-23T18:17:59.760 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true client
2026-03-23T18:17:59.760 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global rm client rbd_cache
2026-03-23T18:17:59.796 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global get client rbd_cache
2026-03-23T18:17:59.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client rbd_cache
2026-03-23T18:17:59.816 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-23T18:17:59.819 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:17:59.819 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client
2026-03-23T18:17:59.819 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * global *$'
2026-03-23T18:17:59.842 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false global
2026-03-23T18:17:59.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global rm client.123 rbd_cache
2026-03-23T18:17:59.868 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global rm global rbd_cache
2026-03-23T18:17:59.896 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool set rbd rbd_cache true
2026-03-23T18:17:59.931 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool list rbd
2026-03-23T18:17:59.931 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-23T18:17:59.957 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true pool
2026-03-23T18:17:59.957 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool get rbd rbd_cache
2026-03-23T18:17:59.957 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^true$'
2026-03-23T18:17:59.981 INFO:tasks.workunit.client.0.vm04.stdout:true
2026-03-23T18:17:59.982 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 1 test1
2026-03-23T18:18:00.287 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image list rbd/test1
2026-03-23T18:18:00.287 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-23T18:18:00.324 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true pool
2026-03-23T18:18:00.324 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image set rbd/test1 rbd_cache false
2026-03-23T18:18:00.365 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image list rbd/test1
2026-03-23T18:18:00.366 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * image *$'
2026-03-23T18:18:00.399 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false image
2026-03-23T18:18:00.399 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image get rbd/test1 rbd_cache
2026-03-23T18:18:00.399 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^false$'
2026-03-23T18:18:00.430 INFO:tasks.workunit.client.0.vm04.stdout:false
2026-03-23T18:18:00.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image remove rbd/test1 rbd_cache
2026-03-23T18:18:00.468 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config image get rbd/test1 rbd_cache
2026-03-23T18:18:00.468 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image get rbd/test1 rbd_cache
2026-03-23T18:18:00.494 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-23T18:18:00.499 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:18:00.499 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image list rbd/test1
2026-03-23T18:18:00.499 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-23T18:18:00.531 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true pool
2026-03-23T18:18:00.531 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool remove rbd rbd_cache
2026-03-23T18:18:00.565 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config pool get rbd rbd_cache
2026-03-23T18:18:00.565 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool get rbd rbd_cache
2026-03-23T18:18:00.586 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-23T18:18:00.589 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:18:00.589 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool list rbd
2026-03-23T18:18:00.589 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * config *$'
2026-03-23T18:18:00.614 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true config
2026-03-23T18:18:00.615 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:18:00.681 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:18:00.678+0000 7f5045c54640 0 -- 192.168.123.104:0/3057178858 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x561c2c0c8b50 msgr2=0x561c2c0fa700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:00.682 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:18:00.685 INFO:tasks.workunit.client.0.vm04.stdout:testing import, export, resize, and snapshots...
2026-03-23T18:18:00.685 INFO:tasks.workunit.client.0.vm04.stderr:+ RBD_CREATE_ARGS=
2026-03-23T18:18:00.685 INFO:tasks.workunit.client.0.vm04.stderr:+ test_others
2026-03-23T18:18:00.686 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing import, export, resize, and snapshots...'
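The config checks above exercise rbd's override hierarchy: an image-level setting shadows the pool-level one, which shadows the global level, and removing the most specific override re-exposes the next one (the `list` output's last column names the level a value came from). A toy model of that lookup order, using shell variables as hypothetical stand-ins for the three levels; this is an illustration only, not how librbd stores settings:

```shell
# Toy model of rbd config resolution: image beats pool beats global.
# The *_rbd_cache variables are hypothetical stand-ins for the real stores.
lookup() {
    key=$1
    for level in image pool global; do
        eval "val=\${${level}_${key}:-}"
        if [ -n "$val" ]; then
            echo "$val ($level)"
            return 0
        fi
    done
    echo "$key is not set"
}

global_rbd_cache=true
pool_rbd_cache=true
image_rbd_cache=false
lookup rbd_cache          # image-level override wins
unset image_rbd_cache
lookup rbd_cache          # falls back to the pool level
```

This mirrors the trace: after `rbd config image remove`, the image listing again reports `rbd_cache true pool`, and after `rbd config pool remove` the value comes from the next source up.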
2026-03-23T18:18:00.686 INFO:tasks.workunit.client.0.vm04.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1'
2026-03-23T18:18:00.686 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:18:00.686 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:00.749 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:00.811 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:00.874 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:00.937 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.000 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.063 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.206 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.335 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.420 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.521 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.601 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.667 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.731 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.796 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.858 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.922 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:01.987 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:18:02.051 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-23T18:18:02.052 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 5.7878e-05 s, 17.7 MB/s
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records in
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records out
2026-03-23T18:18:02.053 INFO:tasks.workunit.client.0.vm04.stderr:10240 bytes (10 kB, 10 KiB) copied, 7.1484e-05 s, 143 MB/s
2026-03-23T18:18:02.054 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000
2026-03-23T18:18:02.054 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records in
2026-03-23T18:18:02.054 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records out
2026-03-23T18:18:02.054 INFO:tasks.workunit.client.0.vm04.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000208321 s, 288 MB/s
2026-03-23T18:18:02.055 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000
2026-03-23T18:18:02.055 INFO:tasks.workunit.client.0.vm04.stderr:134+1 records in
2026-03-23T18:18:02.055 INFO:tasks.workunit.client.0.vm04.stderr:134+1 records out
2026-03-23T18:18:02.055 INFO:tasks.workunit.client.0.vm04.stderr:138216 bytes (138 kB, 135 KiB) copied, 0.000381755 s, 362 MB/s
2026-03-23T18:18:02.055 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000
2026-03-23T18:18:02.056 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records in
2026-03-23T18:18:02.056 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records out
2026-03-23T18:18:02.056 INFO:tasks.workunit.client.0.vm04.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000187141 s, 320 MB/s
2026-03-23T18:18:02.056 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/img1 testimg1
2026-03-23T18:18:02.194 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 4% complete... Importing image: 8% complete... Importing image: 12% complete... Importing image: 16% complete... Importing image: 20% complete... Importing image: 24% complete... Importing image: 28% complete... Importing image: 32% complete... Importing image: 36% complete... Importing image: 40% complete... Importing image: 45% complete... Importing image: 49% complete... Importing image: 53% complete... Importing image: 57% complete... Importing image: 61% complete... Importing image: 65% complete... Importing image: 69% complete... Importing image: 73% complete... Importing image: 77% complete... Importing image: 81% complete... Importing image: 85% complete... Importing image: 90% complete... Importing image: 94% complete... Importing image: 98% complete... Importing image: 100% complete...done.
2026-03-23T18:18:02.199 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=256 --allow-shrink
2026-03-23T18:18:02.235 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 100% complete...done.
2026-03-23T18:18:02.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img2
2026-03-23T18:18:02.336 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete...
Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done. 2026-03-23T18:18:02.344 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1 --snap=snap1 2026-03-23T18:18:02.736 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
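The dd sequence above seeds /tmp/img1 with small data islands at widening offsets so that the subsequent `rbd import` exercises sparse extents. A minimal local reconstruction of the same pattern, using /dev/urandom and a hypothetical scratch path instead of the binaries the workunit happens to copy:

```shell
# Build a sparse test file with data islands at increasing offsets,
# mirroring the dd calls cli_generic.sh runs before `rbd import`.
# /tmp/sparse-img is an illustrative scratch path, not from the run.
IMG=/tmp/sparse-img
rm -f "$IMG"
dd if=/dev/urandom of="$IMG" bs=1k count=1  seek=10   2>/dev/null  # island at 10 KiB
dd if=/dev/urandom of="$IMG" bs=1k count=10 seek=100  2>/dev/null  # island at 100 KiB
dd if=/dev/urandom of="$IMG" bs=1k count=4  seek=1000 2>/dev/null  # island at ~1 MiB
# Apparent size is the end of the last write: (1000 + 4) * 1024 bytes.
stat -c %s "$IMG"
```

Each dd extends the file via seek rather than writing the gap, so the holes stay unallocated on filesystems that support sparse files.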
2026-03-23T18:18:02.760 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128
2026-03-23T18:18:02.794 INFO:tasks.workunit.client.0.vm04.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag
2026-03-23T18:18:02.799 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:18:02.799 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128 --allow-shrink
2026-03-23T18:18:02.838 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 100% complete...done.
2026-03-23T18:18:02.852 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img3
2026-03-23T18:18:02.948 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:02.953 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-23T18:18:02.953 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:02.981 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:02.981 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-23T18:18:02.981 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:03.016 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:03.016 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-23T18:18:03.016 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-23T18:18:03.060 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:03.064 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-23T18:18:03.092 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
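The size checks in this run are plain grep assertions: `rbd info` output is piped through grep, and under the workunit's `set -ex` a non-matching grep exits 1 and fails the whole script. A local stand-in for the pattern (fake_info is hypothetical and only mimics the relevant line of `rbd info` output):

```shell
# Simulate the "rbd info ... | grep 'size ...'" assertion pattern locally.
# fake_info stands in for `rbd info testimg1`; no cluster is needed.
fake_info() { echo "        size 128 MiB in 32 objects"; }
# grep exits 0 on a match and 1 otherwise; under `set -e` a miss aborts.
fake_info | grep 'size 128 MiB'
```

The same idiom appears throughout the log, once per size or snapshot assertion.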
2026-03-23T18:18:03.095 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --size=1 testimg-diff1
2026-03-23T18:18:03.128 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-23T18:18:03.594 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 100% complete...done.
2026-03-23T18:18:03.603 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-23T18:18:03.636 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 100% complete...done.
2026-03-23T18:18:03.644 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-23T18:18:03.644 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:03.671 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:03.671 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-23T18:18:03.671 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:03.701 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:03.701 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1
2026-03-23T18:18:03.702 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:03.729 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:03.730 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-23T18:18:03.730 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:03.758 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:03.758 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-23T18:18:03.852 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done.
2026-03-23T18:18:03.857 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 testimg3
2026-03-23T18:18:03.956 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done.
2026-03-23T18:18:03.962 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-23T18:18:04.049 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done.
2026-03-23T18:18:04.054 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-23T18:18:04.146 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done.
2026-03-23T18:18:04.152 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-23T18:18:04.152 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:04.182 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:04.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:18:04.183 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:04.212 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:04.212 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff2
2026-03-23T18:18:04.212 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:04.240 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:04.240 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff3
2026-03-23T18:18:04.240 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:04.268 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:04.269 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 testimg4
2026-03-23T18:18:04.655 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-23T18:18:04.662 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-23T18:18:05.280 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-23T18:18:05.285 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg4
2026-03-23T18:18:05.285 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:05.321 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:05.321 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg5
2026-03-23T18:18:05.321 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:05.354 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:05.354 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-23T18:18:05.354 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:18:05.354 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:18:05.354 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:18:05.385 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:18:05.385 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-23T18:18:05.385 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap1.*'
2026-03-23T18:18:05.428 INFO:tasks.workunit.client.0.vm04.stdout: 12 snap1 256 MiB Mon Mar 23 18:18:04 2026
2026-03-23T18:18:05.428 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-23T18:18:05.503 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:05.508 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-23T18:18:05.585 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:05.592 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-23T18:18:05.652 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:05.656 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-23T18:18:05.711 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:05.716 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-23T18:18:05.784 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
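The snapshot-count check in this stretch chains `rbd snap ls` through `grep -v SNAPID | wc -l | grep 1`: strip the header row, count the remaining lines, and assert the count via grep's exit status. A local stand-in with a faked listing (fake_snap_ls is illustrative, mimicking the `rbd snap ls` output format seen in the log):

```shell
# Count snapshots the way the workunit does: drop the header line,
# count the rest, and let grep's exit status assert the expected count.
fake_snap_ls() {
  echo "SNAPID  NAME   SIZE     PROTECTED  TIMESTAMP"
  echo "    12  snap1  256 MiB             Mon Mar 23 18:18:04 2026"
}
fake_snap_ls | grep -v SNAPID | wc -l | grep 1
```

Note the check matters here because plain `rbd copy` does not carry snapshots over, while `rbd deep copy` does, so testimg4 is expected to have exactly one snapshot.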
2026-03-23T18:18:05.791 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new
2026-03-23T18:18:05.849 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:05.855 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img2.new
2026-03-23T18:18:06.071 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img3.new
2026-03-23T18:18:06.179 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new
2026-03-23T18:18:06.335 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new
2026-03-23T18:18:06.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg1
2026-03-23T18:18:06.465 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-23T18:18:06.473 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-23T18:18:06.522 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 100% complete...done.
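The four `cmp` invocations above are the actual pass/fail checks for the export and import-diff round-trips: `cmp` exits non-zero at the first differing byte, which aborts the workunit under `set -ex`, so silent success means byte-identical files. A minimal local sketch of the idiom (file names are illustrative, not from the run):

```shell
# cmp exits 0 when files are byte-identical, 1 when they differ;
# the workunit relies on that exit status as its assertion.
printf 'abc' > /tmp/cmp-a
printf 'abc' > /tmp/cmp-b
printf 'abd' > /tmp/cmp-c
cmp /tmp/cmp-a /tmp/cmp-b && echo "identical"   # exit 0: contents match
cmp -s /tmp/cmp-a /tmp/cmp-c || echo "differs"  # exit 1: byte 3 differs (-s: quiet)
```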
2026-03-23T18:18:06.529 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-23T18:18:06.529 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:06.556 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:06.556 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1
2026-03-23T18:18:06.556 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:06.583 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:06.583 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.snap1
2026-03-23T18:18:06.662 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:06.667 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1
2026-03-23T18:18:06.740 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:06.745 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img1.snap1
2026-03-23T18:18:06.941 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1
2026-03-23T18:18:07.368 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg2
2026-03-23T18:18:07.454 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 98% complete...2026-03-23T18:18:07.450+0000 7f3daed55640 0 -- 192.168.123.104:0/3912105746 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f3d88008d30 msgr2=0x7f3d880291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:07.462 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
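The round-trip check above hinges on `cmp`(1): it exits 0 only when the exported file and the reference are byte-identical, and under the workunit's `set -e` any mismatch aborts the test. A minimal standalone sketch of that check, with throwaway temp files standing in for `/tmp/img2` and the exported snapshots (no cluster needed):

```shell
# Stand-ins for the reference image dump and two exported copies.
ref=$(mktemp)
good=$(mktemp)
bad=$(mktemp)
printf 'same bytes' > "$ref"
printf 'same bytes' > "$good"
printf 'diff bytes' > "$bad"

# Identical files: cmp is silent and exits 0.
cmp "$ref" "$good" && match=yes || match=no

# Differing files: non-zero exit (-s suppresses the diagnostic message).
cmp -s "$ref" "$bad" || mismatch=detected

echo "$match $mismatch"   # -> yes detected
rm -f "$ref" "$good" "$bad"
```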
2026-03-23T18:18:07.466 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg3
2026-03-23T18:18:07.601 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 96% complete...2026-03-23T18:18:07.598+0000 7f8fb64fa640 0 -- 192.168.123.104:0/2823714993 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f8f9405c510 msgr2=0x7f8f9407c910 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:07.619 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:18:07.626 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create testimg2 -s 0
2026-03-23T18:18:07.714 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd cp testimg2 testimg3
2026-03-23T18:18:07.805 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done.
2026-03-23T18:18:07.809 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep cp testimg2 testimg6
2026-03-23T18:18:07.871 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-23T18:18:07.875 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg1
2026-03-23T18:18:08.304 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:18:08.353 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg-diff1
2026-03-23T18:18:09.330 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:18:09.354 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-23T18:18:09.354 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-23T18:18:09.400 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory
2026-03-23T18:18:09.400 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-23T18:18:09.401 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-23T18:18:09.448 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory
2026-03-23T18:18:09.450 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd sparsify testimg1
2026-03-23T18:18:09.568 INFO:tasks.workunit.client.0.vm04.stderr: Image sparsify: 100% complete...done.
2026-03-23T18:18:09.610 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:18:09.610 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
[... xtrace line '+ for img in $IMGS' repeated through 2026-03-23T18:18:12.556 ...]
2026-03-23T18:18:12.634 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-23T18:18:12.774 INFO:tasks.workunit.client.0.vm04.stdout:testing locking...
2026-03-23T18:18:12.775 INFO:tasks.workunit.client.0.vm04.stderr:+ test_locking
2026-03-23T18:18:12.775 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing locking...'
2026-03-23T18:18:12.775 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:18:12.775 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
[... xtrace line '+ for img in $IMGS' repeated through 2026-03-23T18:18:14.393 ...]
2026-03-23T18:18:14.493 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 1 test1
2026-03-23T18:18:14.529 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:14.529 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:18:14.529 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:18:14.555 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:18:14.555 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id
2026-03-23T18:18:14.588 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:14.588 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 '
2026-03-23T18:18:14.617 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 exclusive lock on this image.
2026-03-23T18:18:14.618 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:18:14.618 INFO:tasks.workunit.client.0.vm04.stderr:++ tail -n 1
2026-03-23T18:18:14.619 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{print $1;}'
2026-03-23T18:18:14.652 INFO:tasks.workunit.client.0.vm04.stderr:+ LOCKER=client.7758
2026-03-23T18:18:14.652 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock remove test1 id client.7758
2026-03-23T18:18:14.813 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:14.813 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:18:14.813 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:18:14.842 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:18:14.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag
2026-03-23T18:18:14.877 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 '
2026-03-23T18:18:14.877 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:14.905 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 shared lock on this image.
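The `rbd lock list | tail -n 1 | awk '{print $1;}'` pipeline traced above pulls the locker entity out of the table that `rbd lock list` prints. A standalone sketch against canned output modeled on this log (`sample_lock_list` is a stand-in function, not a real rbd call; no cluster needed):

```shell
# Canned `rbd lock list` output shaped like the listings in this trace.
sample_lock_list() {
  printf '%s\n' \
    'There is 1 exclusive lock on this image.' \
    'Locker       ID   Address' \
    'client.7758  id   192.168.123.104:0/2794231221'
}

# Last line of the table, first column -> the locker entity.
LOCKER=$(sample_lock_list | tail -n 1 | awk '{print $1;}')
echo "$LOCKER"   # -> client.7758

# On a live cluster the release step would then be:
#   rbd lock remove test1 id "$LOCKER"
```

The `awk '{print $2, $1;}' | xargs rbd lock remove test1` variant used later in the trace works the same way, just emitting the lock ID and locker in the argument order `lock remove` expects.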
2026-03-23T18:18:14.905 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag
2026-03-23T18:18:14.940 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:14.940 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 2 '
2026-03-23T18:18:14.968 INFO:tasks.workunit.client.0.vm04.stdout:There are 2 shared locks on this image.
2026-03-23T18:18:14.968 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id2 --shared tag
2026-03-23T18:18:15.002 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:15.002 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 3 '
2026-03-23T18:18:15.038 INFO:tasks.workunit.client.0.vm04.stdout:There are 3 shared locks on this image.
2026-03-23T18:18:15.050 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:15.050 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-23T18:18:15.050 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-23T18:18:15.050 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-23T18:18:16.105 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:18:16.105 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qE 'features:.*exclusive'
2026-03-23T18:18:16.139 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n 'There are 2 shared locks on this image.
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:Lock tag: tag
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:Locker ID Address
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:client.7773 id 192.168.123.104:0/2794231221
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:client.7779 id 192.168.123.104:0/4142589643' ']'
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-23T18:18:16.185 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-23T18:18:17.164 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:18:17.213 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n 'There is 1 shared lock on this image.
2026-03-23T18:18:17.213 INFO:tasks.workunit.client.0.vm04.stderr:Lock tag: tag
2026-03-23T18:18:17.213 INFO:tasks.workunit.client.0.vm04.stderr:Locker ID Address
2026-03-23T18:18:17.213 INFO:tasks.workunit.client.0.vm04.stderr:client.7773 id 192.168.123.104:0/2794231221' ']'
2026-03-23T18:18:17.214 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:18:17.214 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-23T18:18:17.214 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-23T18:18:17.214 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-23T18:18:18.133 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:18:18.164 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n '' ']'
2026-03-23T18:18:18.164 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:18:18.251 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:18:18.250+0000 7f000f4ff640 0 -- 192.168.123.104:0/2541899104 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x556b3f4ce8b0 msgr2=0x556b3f4eed30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:18.256 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:18:18.261 INFO:tasks.workunit.client.0.vm04.stdout:testing thick provision...
2026-03-23T18:18:18.261 INFO:tasks.workunit.client.0.vm04.stderr:+ test_thick_provision
2026-03-23T18:18:18.261 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing thick provision...'
2026-03-23T18:18:18.261 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:18:18.261 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
[... xtrace line '+ for img in $IMGS' repeated through 2026-03-23T18:18:19.822 ...]
2026-03-23T18:18:19.914 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --thick-provision -s 64M test1
2026-03-23T18:18:20.384 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 100% complete...done.
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']'
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' '
2026-03-23T18:18:20.396 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^64 MiB'
2026-03-23T18:18:20.397 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5
2026-03-23T18:18:20.436 INFO:tasks.workunit.client.0.vm04.stdout:64 MiB
2026-03-23T18:18:20.436 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0
2026-03-23T18:18:20.436 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']'
2026-03-23T18:18:20.436 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:18:20.436 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-23T18:18:20.468 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED
2026-03-23T18:18:20.468 INFO:tasks.workunit.client.0.vm04.stdout:test1 64 MiB 64 MiB
2026-03-23T18:18:20.472 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']'
2026-03-23T18:18:20.474 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:18:20.582 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 93% complete...2026-03-23T18:18:20.578+0000 7f5eea1ba640 0 -- 192.168.123.104:0/1887776784 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5633aea7a090 msgr2=0x5633aeb5dca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:20.608 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:18:20.606+0000 7f5eea1ba640 0 -- 192.168.123.104:0/1887776784 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f5ec805c9a0 msgr2=0x7f5ec807cda0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:20.612 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
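The `rbd du | grep test1 | tr -s ' ' | cut -d ' ' -f 4-5` pipeline traced above squeezes the whitespace-aligned `rbd du` table and takes fields 4-5 of the image's row, i.e. the USED value; the surrounding `count`/`break` trace is a bounded retry loop waiting for usage stats to settle. A standalone sketch against canned output (`fake_rbd_du` is a stand-in for the real command; no cluster needed):

```shell
# Canned `rbd du` output matching the table printed above.
fake_rbd_du() {
  printf '%s\n' \
    'NAME  PROVISIONED  USED' \
    'test1      64 MiB   64 MiB'
}

# After tr -s ' ', the test1 row reads "test1 64 MiB 64 MiB":
# fields 2-3 are PROVISIONED, fields 4-5 are USED.
count=0
used=
while [ "$count" -lt 10 ]; do
  used=$(fake_rbd_du | grep test1 | tr -s ' ' | cut -d ' ' -f 4-5)
  [ "$used" = '64 MiB' ] && break   # stats settled; stop polling
  count=$((count + 1))
  sleep 1
done
echo "$used"   # -> 64 MiB
```

With canned output the loop exits on the first pass; against a live cluster the retries absorb the lag before `rbd du` reports the thick-provisioned usage.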
2026-03-23T18:18:20.616 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:18:20.616 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:18:20.616 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:18:20.616 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:18:20.642 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:18:20.642 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --thick-provision -s 4G test1
2026-03-23T18:18:21.652 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 5% complete...2026-03-23T18:18:21.650+0000 7fe1c258c640 0 -- 192.168.123.104:0/3181771445 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x555a95bcf800 msgr2=0x555a95befc80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:21.917 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 6% complete...2026-03-23T18:18:21.914+0000 7fe1c258c640 0 -- 192.168.123.104:0/3181771445 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fe1a005c9a0 msgr2=0x7fe1a007cda0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:44.706 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 100% complete...done.
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']'
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' '
2026-03-23T18:18:44.716 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^4 GiB'
2026-03-23T18:18:44.717 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5
2026-03-23T18:18:44.947 INFO:tasks.workunit.client.0.vm04.stdout:4 GiB
2026-03-23T18:18:44.947 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0
2026-03-23T18:18:44.947 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']'
2026-03-23T18:18:44.947 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:18:44.947 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-23T18:18:44.971 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED
2026-03-23T18:18:44.971 INFO:tasks.workunit.client.0.vm04.stdout:test1 4 GiB 4 GiB
2026-03-23T18:18:44.974 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']'
2026-03-23T18:18:44.974 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:18:45.053 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 2% complete...2026-03-23T18:18:45.050+0000 7fe58fead640 0 -- 192.168.123.104:0/1343430366 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x5645d9019b50 msgr2=0x5645d904b680 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:45.080 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete...2026-03-23T18:18:45.078+0000 7fe58ec24640 0 -- 192.168.123.104:0/1343430366 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x5645d90a76b0 msgr2=0x5645d90c7b30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:18:46.768 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 4% complete... ... Removing image: 100% complete...done.
2026-03-23T18:18:46.771 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:18:46.771 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:18:46.771 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:18:46.771 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stdout:testing import, export, resize, and snapshots...
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stderr:+ RBD_CREATE_ARGS='--image-format 2'
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stderr:+ test_others
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing import, export, resize, and snapshots...'
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1'
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:18:46.793 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
... (17 further identical '+ for img in $IMGS' trace lines, 2026-03-23T18:18:46.851 through 2026-03-23T18:18:47.785)
2026-03-23T18:18:47.841 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-23T18:18:47.842 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10
2026-03-23T18:18:47.843 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-23T18:18:47.843 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-23T18:18:47.843 INFO:tasks.workunit.client.0.vm04.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 3.17e-05 s, 32.3 MB/s
2026-03-23T18:18:47.843 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100
2026-03-23T18:18:47.844 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records in
2026-03-23T18:18:47.844 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records out
2026-03-23T18:18:47.844 INFO:tasks.workunit.client.0.vm04.stderr:10240 bytes (10 kB, 10 KiB) copied, 4.6086e-05 s, 222 MB/s
2026-03-23T18:18:47.844 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000
2026-03-23T18:18:47.845 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records in
2026-03-23T18:18:47.845 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records out
2026-03-23T18:18:47.845 INFO:tasks.workunit.client.0.vm04.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000153046 s, 391 MB/s
2026-03-23T18:18:47.845 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000
2026-03-23T18:18:47.846 INFO:tasks.workunit.client.0.vm04.stderr:134+1 records in
2026-03-23T18:18:47.846 INFO:tasks.workunit.client.0.vm04.stderr:134+1 records out
2026-03-23T18:18:47.846 INFO:tasks.workunit.client.0.vm04.stderr:138216 bytes (138 kB, 135 KiB) copied, 0.000348684 s, 396 MB/s
2026-03-23T18:18:47.846 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000
2026-03-23T18:18:47.847 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records in
2026-03-23T18:18:47.847 INFO:tasks.workunit.client.0.vm04.stderr:58+1 records out
2026-03-23T18:18:47.847 INFO:tasks.workunit.client.0.vm04.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.0001494 s, 401 MB/s
2026-03-23T18:18:47.847 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --image-format 2 /tmp/img1 testimg1
2026-03-23T18:18:47.951 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 4% complete... ... Importing image: 100% complete...done.
2026-03-23T18:18:47.956 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=256 --allow-shrink
2026-03-23T18:18:47.985 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 100% complete...done.
2026-03-23T18:18:47.991 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img2
2026-03-23T18:18:48.059 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:48.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-23T18:18:48.139 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
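The dd commands traced above build /tmp/img1 as a sparse file: small data extents are written at widely spaced 1 KiB-block offsets (seek=10, 100, 1000, ...), leaving holes between them, so the subsequent import/export round trips exercise sparse handling. A minimal local sketch of the same construction (the /tmp/sparse-demo name is illustrative, and /dev/zero stands in for the system binaries used as data sources):

```shell
# Build a sparse file the way the workunit builds /tmp/img1:
# write extents at increasing 1 KiB-block seek offsets.
img=/tmp/sparse-demo.$$
rm -f "$img"
dd if=/dev/zero of="$img" bs=1k count=1   seek=10   2>/dev/null  # extent at 10 KiB
dd if=/dev/zero of="$img" bs=1k count=10  seek=100  2>/dev/null  # extent at 100 KiB
dd if=/dev/zero of="$img" bs=1k count=100 seek=1000 2>/dev/null  # extent at 1000 KiB
# The apparent size is set by the last extent's end: (1000+100) KiB.
size=$(wc -c < "$img" | tr -d ' ')
rm -f "$img"
```

Only the written extents consume space; everything between the seek offsets is a hole.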
2026-03-23T18:18:48.145 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128
2026-03-23T18:18:48.169 INFO:tasks.workunit.client.0.vm04.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag
2026-03-23T18:18:48.173 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:18:48.173 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128 --allow-shrink
2026-03-23T18:18:48.202 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 50% complete... ... Resizing image: 100% complete...done.
2026-03-23T18:18:48.207 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img3
2026-03-23T18:18:48.260 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:48.264 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-23T18:18:48.264 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:48.290 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:48.290 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-23T18:18:48.290 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:48.316 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:48.316 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-23T18:18:48.317 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-23T18:18:48.351 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... Exporting image: 37% complete... Exporting image: 100% complete...done.
2026-03-23T18:18:48.355 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-23T18:18:48.379 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-23T18:18:48.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size=1 testimg-diff1
2026-03-23T18:18:48.413 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-23T18:18:49.146 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 22% complete... Importing image diff: 63% complete... Importing image diff: 99% complete... Importing image diff: 100% complete...done.
2026-03-23T18:18:49.153 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-23T18:18:49.185 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 68% complete... Importing image diff: 96% complete... Importing image diff: 100% complete...done.
2026-03-23T18:18:49.191 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-23T18:18:49.191 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:49.420 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:49.420 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-23T18:18:49.420 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:49.448 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:49.448 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1
2026-03-23T18:18:49.448 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:49.475 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:49.475 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-23T18:18:49.475 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:49.505 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:49.505 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-23T18:18:49.586 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... Image copy: 37% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-23T18:18:49.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 testimg3
2026-03-23T18:18:49.672 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... ... Image copy: 100% complete...done.
2026-03-23T18:18:49.678 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-23T18:18:49.756 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... Image copy: 37% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-23T18:18:49.761 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-23T18:18:49.842 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... ... Image copy: 100% complete...done.
2026-03-23T18:18:49.847 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-23T18:18:49.847 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:49.873 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:49.873 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:18:49.873 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:49.900 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:49.900 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff2
2026-03-23T18:18:49.901 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:49.927 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:49.927 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff3
2026-03-23T18:18:49.927 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:49.954 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:49.954 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 testimg4
2026-03-23T18:18:50.181 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... ... Image deep copy: 100% complete...done.
2026-03-23T18:18:50.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-23T18:18:51.186 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... ...
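The `rbd copy` / `rbd deep copy` pair above distinguishes the two operations: a plain copy duplicates only the current image contents, while a deep copy also carries the image's snapshots over, which is why the test then counts the non-header lines of `rbd snap ls testimg4` and expects exactly one (snap1). A minimal sketch of that output-parsing check, with a hard-coded stub in place of the real `rbd snap ls` output (a real invocation needs a cluster):

```shell
# Stub of `rbd snap ls testimg4` output after a deep copy: a header
# line plus one snapshot row, matching the log above.
snap_ls_output='SNAPID  NAME   SIZE     PROTECTED  TIMESTAMP
    16  snap1  256 MiB             Mon Mar 23 18:18:50 2026'

# The workunit's check: drop the header, count the remaining rows.
nsnaps=$(printf '%s\n' "$snap_ls_output" | grep -v SNAPID | wc -l | tr -d ' ')
# And confirm the expected snapshot name is present.
has_snap1=$(printf '%s\n' "$snap_ls_output" | grep -c 'snap1')
```

After a plain `rbd copy`, the same check would find zero snapshot rows.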
Image deep copy: 100% complete...done.
2026-03-23T18:18:51.190 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg4
2026-03-23T18:18:51.190 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-23T18:18:51.217 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-23T18:18:51.217 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg5
2026-03-23T18:18:51.218 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:18:51.244 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:18:51.245 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-23T18:18:51.245 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:18:51.245 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:18:51.245 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:18:51.271 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:18:51.271 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-23T18:18:51.271 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap1.*'
2026-03-23T18:18:51.299 INFO:tasks.workunit.client.0.vm04.stdout: 16 snap1 256 MiB Mon Mar 23 18:18:50 2026
2026-03-23T18:18:51.299 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-23T18:18:51.354 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:51.359 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-23T18:18:51.431 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:51.436 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-23T18:18:51.495 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:51.499 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-23T18:18:51.554 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:51.559 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-23T18:18:51.624 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... ... Exporting image: 100% complete...done.
2026-03-23T18:18:51.629 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new 2026-03-23T18:18:51.687 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done. 2026-03-23T18:18:51.693 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img2.new 2026-03-23T18:18:51.828 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img3.new 2026-03-23T18:18:51.894 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new 2026-03-23T18:18:52.000 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new 2026-03-23T18:18:52.112 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg1 2026-03-23T18:18:52.178 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 1% complete... Rolling back to snapshot: 3% complete... Rolling back to snapshot: 4% complete... Rolling back to snapshot: 6% complete... 
Rolling back to snapshot: 7% complete... Rolling back to snapshot: 9% complete... Rolling back to snapshot: 10% complete... Rolling back to snapshot: 12% complete... Rolling back to snapshot: 14% complete... Rolling back to snapshot: 15% complete... Rolling back to snapshot: 17% complete... Rolling back to snapshot: 18% complete... Rolling back to snapshot: 20% complete... Rolling back to snapshot: 21% complete... Rolling back to snapshot: 23% complete... Rolling back to snapshot: 25% complete... Rolling back to snapshot: 26% complete... Rolling back to snapshot: 28% complete... Rolling back to snapshot: 29% complete... Rolling back to snapshot: 31% complete... Rolling back to snapshot: 32% complete... Rolling back to snapshot: 34% complete... Rolling back to snapshot: 35% complete... Rolling back to snapshot: 37% complete... Rolling back to snapshot: 39% complete... Rolling back to snapshot: 40% complete... Rolling back to snapshot: 42% complete... Rolling back to snapshot: 43% complete... Rolling back to snapshot: 45% complete... Rolling back to snapshot: 46% complete... Rolling back to snapshot: 48% complete... Rolling back to snapshot: 50% complete... Rolling back to snapshot: 51% complete... Rolling back to snapshot: 53% complete... Rolling back to snapshot: 54% complete... Rolling back to snapshot: 56% complete... Rolling back to snapshot: 57% complete... Rolling back to snapshot: 59% complete... Rolling back to snapshot: 60% complete... Rolling back to snapshot: 62% complete... Rolling back to snapshot: 64% complete... Rolling back to snapshot: 65% complete... Rolling back to snapshot: 67% complete... Rolling back to snapshot: 68% complete... Rolling back to snapshot: 70% complete... Rolling back to snapshot: 71% complete... Rolling back to snapshot: 73% complete... Rolling back to snapshot: 75% complete... Rolling back to snapshot: 76% complete... Rolling back to snapshot: 78% complete... Rolling back to snapshot: 79% complete... 
Rolling back to snapshot: 81% complete... Rolling back to snapshot: 82% complete... Rolling back to snapshot: 84% complete... Rolling back to snapshot: 85% complete... Rolling back to snapshot: 87% complete... Rolling back to snapshot: 89% complete... Rolling back to snapshot: 90% complete... Rolling back to snapshot: 92% complete... Rolling back to snapshot: 93% complete... Rolling back to snapshot: 95% complete... Rolling back to snapshot: 96% complete... Rolling back to snapshot: 98% complete... Rolling back to snapshot: 100% complete...done. 2026-03-23T18:18:52.186 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1 2026-03-23T18:18:52.257 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 1% complete... Rolling back to snapshot: 3% complete... Rolling back to snapshot: 4% complete... Rolling back to snapshot: 6% complete... Rolling back to snapshot: 7% complete... Rolling back to snapshot: 9% complete... Rolling back to snapshot: 10% complete... Rolling back to snapshot: 12% complete... Rolling back to snapshot: 14% complete... Rolling back to snapshot: 15% complete... Rolling back to snapshot: 17% complete... Rolling back to snapshot: 18% complete... Rolling back to snapshot: 20% complete... Rolling back to snapshot: 21% complete... Rolling back to snapshot: 23% complete... Rolling back to snapshot: 25% complete... Rolling back to snapshot: 26% complete... Rolling back to snapshot: 28% complete... Rolling back to snapshot: 29% complete... Rolling back to snapshot: 31% complete... Rolling back to snapshot: 32% complete... Rolling back to snapshot: 34% complete... Rolling back to snapshot: 35% complete... Rolling back to snapshot: 37% complete... Rolling back to snapshot: 39% complete... Rolling back to snapshot: 40% complete... Rolling back to snapshot: 42% complete... Rolling back to snapshot: 43% complete... Rolling back to snapshot: 45% complete... Rolling back to snapshot: 46% complete... 
Rolling back to snapshot: 48% complete... Rolling back to snapshot: 50% complete... Rolling back to snapshot: 51% complete... Rolling back to snapshot: 53% complete... Rolling back to snapshot: 54% complete... Rolling back to snapshot: 56% complete... Rolling back to snapshot: 57% complete... Rolling back to snapshot: 59% complete... Rolling back to snapshot: 60% complete... Rolling back to snapshot: 62% complete... Rolling back to snapshot: 64% complete... Rolling back to snapshot: 65% complete... Rolling back to snapshot: 67% complete... Rolling back to snapshot: 68% complete... Rolling back to snapshot: 70% complete... Rolling back to snapshot: 71% complete... Rolling back to snapshot: 73% complete... Rolling back to snapshot: 75% complete... Rolling back to snapshot: 76% complete... Rolling back to snapshot: 78% complete... Rolling back to snapshot: 79% complete... Rolling back to snapshot: 81% complete... Rolling back to snapshot: 82% complete... Rolling back to snapshot: 84% complete... Rolling back to snapshot: 85% complete... Rolling back to snapshot: 87% complete... Rolling back to snapshot: 89% complete... Rolling back to snapshot: 90% complete... Rolling back to snapshot: 92% complete... Rolling back to snapshot: 93% complete... Rolling back to snapshot: 95% complete... Rolling back to snapshot: 96% complete... Rolling back to snapshot: 98% complete... Rolling back to snapshot: 100% complete...done. 
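The exports traced above are validated by byte-comparing each exported file against a reference copy with `cmp` (e.g. `+ cmp /tmp/img2 /tmp/img2.new`); any differing byte makes `cmp` exit non-zero and fails the workunit. A minimal local sketch of that round-trip check, using scratch files in a temp directory as stand-ins for the real `rbd export` output (the paths here are illustrative, not from the test):

```shell
# Sketch of the cmp-based round-trip check the workunit applies to its
# exports. A real run compares `rbd export` output files; scratch files
# stand in for the exported images here.
workdir=$(mktemp -d)
head -c 1048576 /dev/urandom > "$workdir/img.orig"   # stand-in for /tmp/img2
cp "$workdir/img.orig" "$workdir/img.new"            # stand-in for /tmp/img2.new
# cmp -s exits non-zero at the first differing byte, which would abort
# the workunit under its errexit trace.
if cmp -s "$workdir/img.orig" "$workdir/img.new"; then
    result="round-trip OK"
else
    result="round-trip MISMATCH"
fi
echo "$result"
rm -rf "$workdir"
```

The same pattern explains the later `cmp /tmp/img2 /tmp/img1.snap1` checks: after `snap rollback`, the image's export must again match the pre-snapshot reference.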
2026-03-23T18:18:52.265 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB' 2026-03-23T18:18:52.265 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1 2026-03-23T18:18:52.297 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects 2026-03-23T18:18:52.297 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1 2026-03-23T18:18:52.297 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB' 2026-03-23T18:18:52.326 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects 2026-03-23T18:18:52.326 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.snap1 2026-03-23T18:18:52.483 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... 
Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done. 2026-03-23T18:18:52.493 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1 2026-03-23T18:18:52.596 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... 
Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done. 2026-03-23T18:18:52.601 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img1.snap1 2026-03-23T18:18:53.152 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1 2026-03-23T18:18:53.727 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg2 2026-03-23T18:18:53.820 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... 
Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-23T18:18:53.818+0000 7fdcdf891640 0 -- 192.168.123.104:0/2136281944 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55f761ad7930 msgr2=0x55f761af7db0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:18:53.823 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T18:18:53.828 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg3 2026-03-23T18:18:53.926 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-23T18:18:53.922+0000 7fc9721a4640 0 -- 192.168.123.104:0/3215133202 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fc94c008d30 msgr2=0x7fc94c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:18:53.934 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:18:53.939 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create testimg2 -s 0 2026-03-23T18:18:53.975 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd cp testimg2 testimg3 2026-03-23T18:18:54.024 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done. 2026-03-23T18:18:54.028 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep cp testimg2 testimg6 2026-03-23T18:18:54.073 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done. 
2026-03-23T18:18:54.077 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg1 2026-03-23T18:18:54.180 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:18:54.188 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg-diff1 2026-03-23T18:18:55.182 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:18:55.194 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1 2026-03-23T18:18:55.194 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory' 2026-03-23T18:18:55.220 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory 2026-03-23T18:18:55.220 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1 2026-03-23T18:18:55.220 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory' 2026-03-23T18:18:55.253 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory 2026-03-23T18:18:55.253 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd sparsify testimg1 2026-03-23T18:18:55.303 INFO:tasks.workunit.client.0.vm04.stderr: Image sparsify: 1% complete... Image sparsify: 3% complete... Image sparsify: 4% complete... Image sparsify: 6% complete... Image sparsify: 7% complete... Image sparsify: 9% complete... Image sparsify: 10% complete... Image sparsify: 12% complete... Image sparsify: 14% complete... Image sparsify: 15% complete... Image sparsify: 17% complete... Image sparsify: 18% complete... Image sparsify: 20% complete... Image sparsify: 21% complete... Image sparsify: 23% complete... Image sparsify: 25% complete... Image sparsify: 26% complete... Image sparsify: 28% complete... Image sparsify: 29% complete... Image sparsify: 31% complete... Image sparsify: 32% complete... Image sparsify: 34% complete... 
Image sparsify: 35% complete... Image sparsify: 37% complete... Image sparsify: 39% complete... Image sparsify: 40% complete... Image sparsify: 42% complete... Image sparsify: 43% complete... Image sparsify: 45% complete... Image sparsify: 46% complete... Image sparsify: 48% complete... Image sparsify: 50% complete... Image sparsify: 51% complete... Image sparsify: 53% complete... Image sparsify: 54% complete... Image sparsify: 56% complete... Image sparsify: 57% complete... Image sparsify: 59% complete... Image sparsify: 60% complete... Image sparsify: 62% complete... Image sparsify: 64% complete... Image sparsify: 65% complete... Image sparsify: 67% complete... Image sparsify: 68% complete... Image sparsify: 70% complete... Image sparsify: 71% complete... Image sparsify: 73% complete... Image sparsify: 75% complete... Image sparsify: 76% complete... Image sparsify: 78% complete... Image sparsify: 79% complete... Image sparsify: 81% complete... Image sparsify: 82% complete... Image sparsify: 84% complete... Image sparsify: 85% complete... Image sparsify: 87% complete... Image sparsify: 89% complete... Image sparsify: 90% complete... Image sparsify: 92% complete... Image sparsify: 93% complete... Image sparsify: 95% complete... Image sparsify: 96% complete... Image sparsify: 98% complete... Image sparsify: 100% complete...done. 
2026-03-23T18:18:55.311 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:18:55.311 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:55.427 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:55.539 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:55.657 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:56.287 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:57.348 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:57.548 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:57.655 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:57.763 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:58.078 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:58.200 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:58.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:58.983 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.066 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.136 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.208 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.283 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.376 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1 2026-03-23T18:18:59.522 INFO:tasks.workunit.client.0.vm04.stdout:testing locking... 
2026-03-23T18:18:59.522 INFO:tasks.workunit.client.0.vm04.stderr:+ test_locking 2026-03-23T18:18:59.522 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing locking...' 2026-03-23T18:18:59.522 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:18:59.522 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.606 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.687 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.761 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.834 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.916 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:18:59.994 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.176 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.261 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.337 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.413 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.521 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.605 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.680 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:00.958 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:01.032 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:01.103 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:19:01.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-23T18:19:01.415 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-23T18:19:01.415 
INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:01.415 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-23T18:19:01.449 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:01.449 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id 2026-03-23T18:19:01.486 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-23T18:19:01.486 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 ' 2026-03-23T18:19:01.521 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 exclusive lock on this image. 2026-03-23T18:19:01.522 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1 2026-03-23T18:19:01.522 INFO:tasks.workunit.client.0.vm04.stderr:++ tail -n 1 2026-03-23T18:19:01.522 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{print $1;}' 2026-03-23T18:19:01.552 INFO:tasks.workunit.client.0.vm04.stderr:+ LOCKER=client.8441 2026-03-23T18:19:01.552 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock remove test1 id client.8441 2026-03-23T18:19:02.127 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-23T18:19:02.127 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:02.127 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-23T18:19:02.163 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:02.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag 2026-03-23T18:19:02.204 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-23T18:19:02.204 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 ' 2026-03-23T18:19:02.234 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 shared lock on this image. 
2026-03-23T18:19:02.234 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag
2026-03-23T18:19:02.270 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:19:02.270 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 2 '
2026-03-23T18:19:02.302 INFO:tasks.workunit.client.0.vm04.stdout:There are 2 shared locks on this image.
2026-03-23T18:19:02.302 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id2 --shared tag
2026-03-23T18:19:02.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:19:02.341 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 3 '
2026-03-23T18:19:02.371 INFO:tasks.workunit.client.0.vm04.stdout:There are 3 shared locks on this image.
2026-03-23T18:19:02.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:19:02.371 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-23T18:19:02.372 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-23T18:19:02.372 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-23T18:19:03.134 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-23T18:19:03.134 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qE 'features:.*exclusive'
2026-03-23T18:19:03.167 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:19:03.200 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n 'There are 2 shared locks on this image.
2026-03-23T18:19:03.200 INFO:tasks.workunit.client.0.vm04.stderr:Lock tag: tag
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:Locker ID Address
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:client.8456 id 192.168.123.104:0/660690074
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:client.8462 id 192.168.123.104:0/1414081755' ']'
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-23T18:19:03.201 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-23T18:19:04.137 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:19:04.170 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n 'There is 1 shared lock on this image.
2026-03-23T18:19:04.170 INFO:tasks.workunit.client.0.vm04.stderr:Lock tag: tag
2026-03-23T18:19:04.170 INFO:tasks.workunit.client.0.vm04.stderr:Locker ID Address
2026-03-23T18:19:04.170 INFO:tasks.workunit.client.0.vm04.stderr:client.8456 id 192.168.123.104:0/660690074' ']'
2026-03-23T18:19:04.171 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-23T18:19:04.171 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-23T18:19:04.171 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-23T18:19:04.171 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-23T18:19:05.143 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-23T18:19:05.179 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n '' ']'
2026-03-23T18:19:05.179 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:19:05.278 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:05.274+0000 7fbfe3d84640 0 -- 192.168.123.104:0/1530848172 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x56466e075880 msgr2=0x56466e065780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:05.280 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:05.285 INFO:tasks.workunit.client.0.vm04.stderr:+ test_clone
2026-03-23T18:19:05.285 INFO:tasks.workunit.client.0.vm04.stdout:testing clone...
2026-03-23T18:19:05.285 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing clone...'
2026-03-23T18:19:05.285 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:19:05.285 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.357 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.433 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.512 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.587 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.663 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.824 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.900 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:05.976 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.052 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.136 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.212 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.495 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.655 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.732 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.812 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:06.892 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create test1 --image-format 2 -s 1
2026-03-23T18:19:06.939 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@s1
2026-03-23T18:19:07.150 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:07.159 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@s1
2026-03-23T18:19:07.198 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T18:19:08.203 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T18:19:08.224 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T18:19:11.122 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@s1 rbd2/clone
2026-03-23T18:19:11.181 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls
2026-03-23T18:19:11.181 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone
2026-03-23T18:19:11.209 INFO:tasks.workunit.client.0.vm04.stdout:clone
2026-03-23T18:19:11.210 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls -l
2026-03-23T18:19:11.210 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone
2026-03-23T18:19:11.210 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1@s1
2026-03-23T18:19:11.251 INFO:tasks.workunit.client.0.vm04.stdout:clone 1 MiB rbd/test1@s1 2
2026-03-23T18:19:11.252 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd ls
2026-03-23T18:19:11.280 INFO:tasks.workunit.client.0.vm04.stderr:+ test test1 = test1
2026-03-23T18:19:11.280 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten rbd2/clone
2026-03-23T18:19:11.320 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 100% complete...done.
2026-03-23T18:19:11.327 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd2/clone@s1
2026-03-23T18:19:12.168 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:12.175 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd2/clone@s1
2026-03-23T18:19:12.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone@s1 clone2
2026-03-23T18:19:12.280 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:12.280 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone2
2026-03-23T18:19:12.308 INFO:tasks.workunit.client.0.vm04.stdout:clone2
2026-03-23T18:19:12.308 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T18:19:12.308 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone2
2026-03-23T18:19:12.308 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/clone@s1
2026-03-23T18:19:12.349 INFO:tasks.workunit.client.0.vm04.stdout:clone2 1 MiB rbd2/clone@s1 2
2026-03-23T18:19:12.349 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd -p rbd2 ls
2026-03-23T18:19:12.381 INFO:tasks.workunit.client.0.vm04.stderr:+ test clone = clone
2026-03-23T18:19:12.381 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone clone3
2026-03-23T18:19:12.381 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'snapshot name was not specified'
2026-03-23T18:19:12.400 INFO:tasks.workunit.client.0.vm04.stdout:rbd: snapshot name was not specified
2026-03-23T18:19:12.401 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone@invalid clone3
2026-03-23T18:19:12.401 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'failed to open parent image'
2026-03-23T18:19:12.439 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23T18:19:12.430+0000 7f3afaffd640 -1 librbd::image::CloneRequest: 0x55ec669c2d20 handle_open_parent: failed to open parent image: (2) No such file or directory
2026-03-23T18:19:12.439 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone --snap-id 0 clone3
2026-03-23T18:19:12.439 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'failed to open parent image'
2026-03-23T18:19:12.474 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23T18:19:12.466+0000 7ffac1ffb640 -1 librbd::image::CloneRequest: 0x560e39420e10 handle_open_parent: failed to open parent image: (2) No such file or directory
2026-03-23T18:19:12.474 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone@invalid --snap-id 0 clone3
2026-03-23T18:19:12.474 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'trying to access snapshot using both name and id'
2026-03-23T18:19:12.495 INFO:tasks.workunit.client.0.vm04.stdout:rbd: trying to access snapshot using both name and id.
2026-03-23T18:19:12.496 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd snap ls rbd2/clone --format json
2026-03-23T18:19:12.496 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.[] | select(.name == "s1") | .id'
2026-03-23T18:19:12.530 INFO:tasks.workunit.client.0.vm04.stderr:+ SNAP_ID=19
2026-03-23T18:19:12.530 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --snap-id 19 rbd2/clone clone3
2026-03-23T18:19:12.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:12.590 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone3
2026-03-23T18:19:12.616 INFO:tasks.workunit.client.0.vm04.stdout:clone3
2026-03-23T18:19:12.616 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T18:19:12.616 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone3
2026-03-23T18:19:12.616 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/clone@s1
2026-03-23T18:19:12.663 INFO:tasks.workunit.client.0.vm04.stdout:clone3 1 MiB rbd2/clone@s1 2
2026-03-23T18:19:12.663 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd -p rbd2 ls
2026-03-23T18:19:12.692 INFO:tasks.workunit.client.0.vm04.stderr:+ test clone = clone
2026-03-23T18:19:12.692 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd ls -l
2026-03-23T18:19:12.692 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c rbd2/clone@s1
2026-03-23T18:19:12.740 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 = 2
2026-03-23T18:19:12.740 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten clone3
2026-03-23T18:19:12.786 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 100% complete...done.
2026-03-23T18:19:12.796 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd ls -l
2026-03-23T18:19:12.796 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c rbd2/clone@s1
2026-03-23T18:19:12.840 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1
2026-03-23T18:19:12.840 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone2
2026-03-23T18:19:12.937 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:12.934+0000 7fb298f03640 0 -- 192.168.123.104:0/929686224 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7fb270008d30 msgr2=0x7fb2700291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:12.947 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:12.952 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect rbd2/clone@s1
2026-03-23T18:19:12.990 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd2/clone@s1
2026-03-23T18:19:13.169 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:19:13.177 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd2/clone
2026-03-23T18:19:13.239 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:13.238+0000 7fa610924640 0 -- 192.168.123.104:0/1809899758 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fa5f005d170 msgr2=0x7fa5f007d570 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:13.246 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:13.250 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone3
2026-03-23T18:19:13.328 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:13.326+0000 7f0a0aeb0640 0 -- 192.168.123.104:0/580836443 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f09e8006d80 msgr2=0x7f09e8007190 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:13.334 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:13.339 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test1@s1
2026-03-23T18:19:13.380 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@s1
2026-03-23T18:19:14.177 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:19:14.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:19:14.274 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:14.270+0000 7f2efa800640 0 -- 192.168.123.104:0/3882111761 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f2ed4012d30 msgr2=0x7f2ed40131a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:14.280 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:14.284 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T18:19:15.232 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T18:19:15.250 INFO:tasks.workunit.client.0.vm04.stdout:testing trash...
2026-03-23T18:19:15.250 INFO:tasks.workunit.client.0.vm04.stderr:+ test_trash
2026-03-23T18:19:15.250 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing trash...'
2026-03-23T18:19:15.250 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:19:15.250 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.332 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.417 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.524 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.605 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.715 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.796 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.885 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:15.966 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.045 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.206 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.298 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.380 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.462 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.544 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.627 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.711 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:16.794 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-23T18:19:16.838 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-23T18:19:16.880 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:16.880 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:19:16.909 INFO:tasks.workunit.client.0.vm04.stdout:test1
2026-03-23T18:19:16.909 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:16.909 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T18:19:16.939 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-23T18:19:16.939 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:16.939 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:16.939 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T18:19:16.968 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T18:19:16.968 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T18:19:16.968 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*2.*'
2026-03-23T18:19:17.007 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 2
2026-03-23T18:19:17.008 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T18:19:17.008 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*2.*'
2026-03-23T18:19:17.047 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-23T18:19:17.047 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test1
2026-03-23T18:19:17.106 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:17.106 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T18:19:17.133 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-23T18:19:17.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:17.134 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:17.134 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:17.164 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:17.165 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T18:19:17.165 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*2.*'
2026-03-23T18:19:17.200 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-23T18:19:17.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:17.200 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:19:17.229 INFO:tasks.workunit.client.0.vm04.stdout:227727dc9b9f test1
2026-03-23T18:19:17.230 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:17.230 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:17.230 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:17.257 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:17.258 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-23T18:19:17.258 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*USER.*'
2026-03-23T18:19:17.291 INFO:tasks.workunit.client.0.vm04.stdout:227727dc9b9f test1 USER Mon Mar 23 18:19:17 2026 expired at Mon Mar 23 18:19:17 2026
2026-03-23T18:19:17.291 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-23T18:19:17.291 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v 'protected until'
2026-03-23T18:19:17.324 INFO:tasks.workunit.client.0.vm04.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT
2026-03-23T18:19:17.324 INFO:tasks.workunit.client.0.vm04.stdout:227727dc9b9f test1 USER Mon Mar 23 18:19:17 2026 expired at Mon Mar 23 18:19:17 2026
2026-03-23T18:19:17.324 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-23T18:19:17.325 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-23T18:19:17.351 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=227727dc9b9f
2026-03-23T18:19:17.351 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 227727dc9b9f
2026-03-23T18:19:17.401 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:17.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test2
2026-03-23T18:19:17.472 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-23T18:19:17.472 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-23T18:19:17.499 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=227afcb0bcbc
2026-03-23T18:19:17.500 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --image-id 227afcb0bcbc
2026-03-23T18:19:17.500 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd image '\''test2'\'''
2026-03-23T18:19:17.539 INFO:tasks.workunit.client.0.vm04.stdout:rbd image 'test2':
2026-03-23T18:19:17.539 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --image-id 227afcb0bcbc
2026-03-23T18:19:17.539 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:17.540 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:19:17.573 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:19:17.573 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash restore 227afcb0bcbc
2026-03-23T18:19:17.618 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:17.618 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T18:19:17.837 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:17.838+0000 7f73dd456640 0 --2- 192.168.123.104:0/3639332039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x555b401625d0 0x555b401567f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-23T18:19:17.845 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-23T18:19:17.845 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:19:17.845 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:17.845 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:17.878 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:17.878 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-23T18:19:17.878 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*2.*'
2026-03-23T18:19:17.912 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-23T18:19:17.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test2 --expires-at '3600 sec'
2026-03-23T18:19:17.978 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image test2 will expire at 2026-03-23T19:19:17.945220+0000
2026-03-23T18:19:17.983 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:17.983 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-23T18:19:18.007 INFO:tasks.workunit.client.0.vm04.stdout:227afcb0bcbc test2
2026-03-23T18:19:18.008 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:18.008 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:18.008 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:18.031 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:18.031 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-23T18:19:18.031 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*USER.*protected until'
2026-03-23T18:19:18.056 INFO:tasks.workunit.client.0.vm04.stdout:227afcb0bcbc test2 USER Mon Mar 23 18:19:17 2026 protected until Mon Mar 23 19:19:17 2026
2026-03-23T18:19:18.056 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 227afcb0bcbc
2026-03-23T18:19:18.056 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'Deferment time has not expired'
2026-03-23T18:19:18.076 INFO:tasks.workunit.client.0.vm04.stdout:Deferment time has not expired, please use --force if you really want to remove the image
2026-03-23T18:19:18.077 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm --image-id 227afcb0bcbc --force
2026-03-23T18:19:18.117 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:18.120 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-23T18:19:18.160 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap1
2026-03-23T18:19:19.193 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:19.203 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@snap1
2026-03-23T18:19:19.269 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@snap1 clone
2026-03-23T18:19:19.337 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test1
2026-03-23T18:19:19.391 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:19.391 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:19:19.414 INFO:tasks.workunit.client.0.vm04.stdout:22d1866f2cf8 test1
2026-03-23T18:19:19.415 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:19.415 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:19.415 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:19.441 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:19.441 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-23T18:19:19.441 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*USER.*'
2026-03-23T18:19:19.476 INFO:tasks.workunit.client.0.vm04.stdout:22d1866f2cf8 test1 USER Mon Mar 23 18:19:19 2026 expired at Mon Mar 23 18:19:19 2026
2026-03-23T18:19:19.477 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-23T18:19:19.477 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v 'protected until'
2026-03-23T18:19:19.514 INFO:tasks.workunit.client.0.vm04.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT
2026-03-23T18:19:19.514 INFO:tasks.workunit.client.0.vm04.stdout:22d1866f2cf8 test1 USER Mon Mar 23 18:19:19 2026 expired at Mon Mar 23 18:19:19 2026
2026-03-23T18:19:19.514 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-23T18:19:19.515 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-23T18:19:19.550 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=22d1866f2cf8
2026-03-23T18:19:19.550 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 22d1866f2cf8
2026-03-23T18:19:19.550 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:19:19.550 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:19.550 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:19.592 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:19.593 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 22d1866f2cf8
2026-03-23T18:19:19.593 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap1.*'
2026-03-23T18:19:19.628 INFO:tasks.workunit.client.0.vm04.stdout: 20 snap1 1 MiB yes Mon Mar 23 18:19:19 2026
2026-03-23T18:19:19.629 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --image-id 22d1866f2cf8
2026-03-23T18:19:19.629 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:19.629 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:19.666 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:19.667 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --image-id 22d1866f2cf8
2026-03-23T18:19:19.667 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone
2026-03-23T18:19:19.701 INFO:tasks.workunit.client.0.vm04.stdout:rbd/clone
2026-03-23T18:19:19.702 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone
2026-03-23T18:19:19.982 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:19.978+0000 7f50aeee2640 0 -- 192.168.123.104:0/1669975666 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f508c05ca30 msgr2=0x7f508c07ce30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:19.991 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:19.995 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect --image-id 22d1866f2cf8 --snap snap1
2026-03-23T18:19:20.059 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --image-id 22d1866f2cf8 --snap snap1
2026-03-23T18:19:21.141 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:19:21.152 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 22d1866f2cf8
2026-03-23T18:19:21.153 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:19:21.153 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:21.153 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:19:21.183 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:19:21.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash restore 22d1866f2cf8
2026-03-23T18:19:21.220 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap1
2026-03-23T18:19:22.241 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:22.279 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap2
2026-03-23T18:19:23.229 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:23.241 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 22d1866f2cf8
2026-03-23T18:19:23.241 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:19:23.241 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:23.241 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T18:19:23.277 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T18:19:23.277 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 22d1866f2cf8
2026-03-23T18:19:25.246 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-23T18:19:25.255 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 22d1866f2cf8
2026-03-23T18:19:25.255 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:19:25.255 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:25.255 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:19:25.287 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:19:25.287 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_to_trash_on_remove=true --rbd_move_to_trash_on_remove_expire_seconds=3600 test1
2026-03-23T18:19:25.334 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:25.340 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:25.340 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:19:25.366 INFO:tasks.workunit.client.0.vm04.stdout:22d1866f2cf8 test1
2026-03-23T18:19:25.366 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:25.366 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:25.367 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:19:25.392 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:19:25.392 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-23T18:19:25.393 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*USER.*protected until'
2026-03-23T18:19:25.424 INFO:tasks.workunit.client.0.vm04.stdout:22d1866f2cf8 test1 USER Mon Mar 23 18:19:25 2026 protected until Mon Mar 23 19:19:25 2026
2026-03-23T18:19:25.424 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 22d1866f2cf8
2026-03-23T18:19:25.424 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'Deferment time has not expired'
2026-03-23T18:19:25.450 INFO:tasks.workunit.client.0.vm04.stdout:Deferment time has not expired, please use --force if you really want to remove the image
2026-03-23T18:19:25.450 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm --image-id 22d1866f2cf8 --force
2026-03-23T18:19:25.496 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:19:25.500 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:19:25.500 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:25.573 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:25.848 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:25.926 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.004 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.082 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.153 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.224 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.296 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.367 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.433 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.503 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.572 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.646 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.717 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.784 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.855 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:26.928 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.001 INFO:tasks.workunit.client.0.vm04.stdout:testing trash purge...
2026-03-23T18:19:27.001 INFO:tasks.workunit.client.0.vm04.stderr:+ test_purge
2026-03-23T18:19:27.001 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing trash purge...'
2026-03-23T18:19:27.001 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:19:27.001 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.071 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.140 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.210 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.267 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.346 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.418 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.499 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.571 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.640 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.714 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.790 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.868 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:27.944 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:28.018 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:28.089 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:28.165 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:28.241 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:19:28.317 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:28.317 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:28.317 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:19:28.344 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:19:28.344 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:19:28.370 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete...done.
2026-03-23T18:19:28.374 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-23T18:19:28.412 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-23T18:19:28.454 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-23T18:19:28.510 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-23T18:19:28.567 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:28.567 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:28.567 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T18:19:28.596 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T18:19:28.596 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:19:28.675 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 50% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-23T18:19:28.680 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:28.680 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:28.680 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:19:28.704 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:19:28.704 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-23T18:19:28.743 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-23T18:19:28.786 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 --expires-at '1 hour'
2026-03-23T18:19:28.833 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image testimg1 will expire at 2026-03-23T19:19:28.813010+0000
2026-03-23T18:19:28.838 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 --expires-at '3 hours'
2026-03-23T18:19:28.890 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image testimg2 will expire at 2026-03-23T21:19:28.865080+0000
2026-03-23T18:19:28.894 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:28.894 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:28.894 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T18:19:28.925 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T18:19:28.925 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:19:28.949 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete...done.
2026-03-23T18:19:28.953 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:28.953 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:28.953 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2 2026-03-23T18:19:28.980 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-23T18:19:28.980 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge --expired-before 'now + 2 hours' 2026-03-23T18:19:29.031 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-23T18:19:29.035 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:29.035 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:29.036 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1 2026-03-23T18:19:29.062 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:19:29.062 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:29.062 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2 2026-03-23T18:19:29.088 INFO:tasks.workunit.client.0.vm04.stdout:2418e6252b1d testimg2 2026-03-23T18:19:29.088 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge --expired-before 'now + 4 hours' 2026-03-23T18:19:29.141 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-23T18:19:29.154 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:29.154 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:29.154 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-23T18:19:29.186 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:29.186 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-23T18:19:29.228 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap 2026-03-23T18:19:30.133 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
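The `--expires-at` / `--expired-before` steps above demonstrate trash deferment: a plain `rbd trash purge` leaves both images in place because their deferment periods have not elapsed, `--expired-before 'now + 2 hours'` removes only `testimg1` (1-hour expiry), and `'now + 4 hours'` removes `testimg2` as well. A minimal sketch of that semantics, assuming a toy in-memory model rather than rbd's actual implementation:

```python
from datetime import datetime, timedelta

# Toy model (NOT librbd code) of trash deferment as exercised in the log:
# purge only removes entries whose expiry is at or before the cutoff, and
# --expired-before moves that cutoff forward from "now".
def purge(trash, cutoff):
    """Remove and return names of trash entries expired at `cutoff`."""
    removed = [name for name, expires in trash.items() if expires <= cutoff]
    for name in removed:
        del trash[name]
    return removed

now = datetime(2026, 3, 23, 18, 19, 28)
trash = {
    "testimg1": now + timedelta(hours=1),  # rbd trash mv testimg1 --expires-at '1 hour'
    "testimg2": now + timedelta(hours=3),  # rbd trash mv testimg2 --expires-at '3 hours'
}

assert purge(trash, now) == []                                  # plain purge: both deferred
assert purge(trash, now + timedelta(hours=2)) == ["testimg1"]   # --expired-before 'now + 2 hours'
assert purge(trash, now + timedelta(hours=4)) == ["testimg2"]   # --expired-before 'now + 4 hours'
```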
2026-03-23T18:19:30.144 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-23T18:19:30.184 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-23T18:19:30.227 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 2026-03-23T18:19:30.283 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 2026-03-23T18:19:30.348 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3 2026-03-23T18:19:30.400 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:30.400 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:30.400 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3 2026-03-23T18:19:30.427 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-23T18:19:30.427 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:30.427 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:30.530 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:30.531 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:30.531 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:30.531 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1 2026-03-23T18:19:30.554 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:19:30.554 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:30.555 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1 2026-03-23T18:19:30.576 INFO:tasks.workunit.client.0.vm04.stdout:2438d8e9fc55 testimg1 2026-03-23T18:19:30.577 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls 2026-03-23T18:19:30.577 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }' 2026-03-23T18:19:30.600 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=2438d8e9fc55 2026-03-23T18:19:30.600 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 2438d8e9fc55 
2026-03-23T18:19:31.135 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 2026-03-23T18:19:31.142 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:31.188 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-23T18:19:31.193 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:31.193 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:31.193 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-23T18:19:31.216 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:31.216 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-23T18:19:31.250 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-23T18:19:31.287 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap 2026-03-23T18:19:32.139 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
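The segment above shows why `rbd trash purge` reports `rbd: some expired images could not be removed`: an image in the trash that still has snapshots is skipped, and only after `rbd snap purge --image-id <ID>` does a subsequent purge remove it. A hedged sketch of that rule, as a toy model rather than rbd's actual logic:

```python
# Toy model (assumption: simplified, not librbd code) of purge skipping
# trashed images that still carry snapshots, matching the log where three
# images are trashed, two are purged, and the snapshot-bearing one
# survives until `rbd snap purge --image-id` clears it.
def purge(trash):
    """trash maps image name -> set of snapshot names; returns (removed, skipped)."""
    removed = [name for name, snaps in trash.items() if not snaps]
    skipped = [name for name, snaps in trash.items() if snaps]
    for name in removed:
        del trash[name]
    return removed, skipped

trash = {"testimg1": {"snap"}, "testimg2": set(), "testimg3": set()}
removed, skipped = purge(trash)
assert sorted(removed) == ["testimg2", "testimg3"]
assert skipped == ["testimg1"]     # would trigger "some expired images could not be removed"

trash["testimg1"].clear()          # models: rbd snap purge --image-id <id>
removed, skipped = purge(trash)
assert removed == ["testimg1"] and not skipped and not trash
```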
2026-03-23T18:19:32.146 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-23T18:19:32.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 2026-03-23T18:19:32.237 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 2026-03-23T18:19:32.496 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3 2026-03-23T18:19:32.554 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:32.554 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:32.554 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3 2026-03-23T18:19:32.581 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-23T18:19:32.581 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:32.581 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:32.685 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:32.686 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:32.686 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:32.686 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1 2026-03-23T18:19:32.713 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:19:32.713 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:32.713 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2 2026-03-23T18:19:32.736 INFO:tasks.workunit.client.0.vm04.stdout:2465bd9de3ad testimg2 2026-03-23T18:19:32.737 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls 2026-03-23T18:19:32.737 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }' 2026-03-23T18:19:32.762 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=2465bd9de3ad 2026-03-23T18:19:32.762 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 2465bd9de3ad 2026-03-23T18:19:33.142 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... 
Removing all snapshots: 100% complete...done. 2026-03-23T18:19:33.151 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:33.196 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-23T18:19:33.199 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:33.199 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:33.199 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-23T18:19:33.222 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:33.222 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-23T18:19:33.259 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-23T18:19:33.501 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-23T18:19:33.540 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg3@snap 2026-03-23T18:19:34.153 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-23T18:19:34.165 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 2026-03-23T18:19:34.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 2026-03-23T18:19:34.258 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3 2026-03-23T18:19:34.306 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:34.306 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:34.306 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3 2026-03-23T18:19:34.330 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-23T18:19:34.330 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:34.330 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:34.430 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:34.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:34.430 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:34.430 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1 2026-03-23T18:19:34.455 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:19:34.456 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:34.456 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3 2026-03-23T18:19:34.480 INFO:tasks.workunit.client.0.vm04.stdout:249546a9f410 testimg3 2026-03-23T18:19:34.480 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls 2026-03-23T18:19:34.480 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }' 2026-03-23T18:19:34.506 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=249546a9f410 2026-03-23T18:19:34.506 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 249546a9f410 2026-03-23T18:19:35.156 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
2026-03-23T18:19:35.164 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:35.210 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-23T18:19:35.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:35.214 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:35.214 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-23T18:19:35.239 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:35.239 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-23T18:19:35.274 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap 2026-03-23T18:19:36.162 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:36.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2 2026-03-23T18:19:36.218 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap 2026-03-23T18:19:36.251 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:36.258 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-23T18:19:36.298 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap 2026-03-23T18:19:37.171 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:37.182 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4 2026-03-23T18:19:37.238 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5 2026-03-23T18:19:37.300 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap 2026-03-23T18:19:37.341 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 
2026-03-23T18:19:37.348 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap 2026-03-23T18:19:38.171 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:38.180 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6 2026-03-23T18:19:38.244 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap 2026-03-23T18:19:38.296 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:38.305 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 2026-03-23T18:19:38.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 2026-03-23T18:19:38.421 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3 2026-03-23T18:19:38.476 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg4 2026-03-23T18:19:38.572 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:38.572 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:38.572 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4 2026-03-23T18:19:38.599 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-23T18:19:38.637 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:38.637 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:38.794 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:38.794 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:38.794 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:38.794 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3 2026-03-23T18:19:38.819 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-23T18:19:38.819 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:38.820 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1 2026-03-23T18:19:38.845 
INFO:tasks.workunit.client.0.vm04.stdout:24bcfe58be9 testimg1 2026-03-23T18:19:38.845 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:38.845 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2 2026-03-23T18:19:38.872 INFO:tasks.workunit.client.0.vm04.stdout:24c242ec292c testimg2 2026-03-23T18:19:38.872 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:38.872 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg4 2026-03-23T18:19:38.902 INFO:tasks.workunit.client.0.vm04.stdout:24cebd05fa35 testimg4 2026-03-23T18:19:38.902 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6 2026-03-23T18:19:38.961 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:38.961 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:38.961 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4 2026-03-23T18:19:38.989 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-23T18:19:38.989 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:38.989 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:40.268 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:40.269 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:40.269 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:40.269 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2 2026-03-23T18:19:40.295 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-23T18:19:40.296 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:40.296 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1 2026-03-23T18:19:40.324 INFO:tasks.workunit.client.0.vm04.stdout:24bcfe58be9 testimg1 2026-03-23T18:19:40.325 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:40.325 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2 2026-03-23T18:19:40.348 
INFO:tasks.workunit.client.0.vm04.stdout:24c242ec292c testimg2 2026-03-23T18:19:40.348 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5 2026-03-23T18:19:40.402 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:40.402 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:40.402 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3 2026-03-23T18:19:40.430 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-23T18:19:40.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:40.496 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:40.494+0000 7ff32121b640 0 -- 192.168.123.104:0/3919611929 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55ca84f27620 msgr2=0x55ca84f597e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:19:40.501 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:40.498+0000 7ff31ff92640 0 -- 192.168.123.104:0/3919611929 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7ff2fc0467f0 msgr2=0x7ff2fc066bd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:19:42.228 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done. 
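The clone sequence above (format-2 clones `testimg2`, `testimg4`, `testimg5`, `testimg6` built on `testimg1`) shows the other reason purge can fail: a trashed parent cannot be removed while a live clone still references it, and once every descendant is also in the trash, the final purge removes children before parents (the `33% ... 66% ... 100%` progress line). A minimal sketch of that dependency ordering, assuming a simplified toy model rather than librbd's implementation:

```python
# Toy model (assumption: not librbd's actual algorithm) of clone-aware
# trash purge: an image is removable only once none of its clones survive,
# so purge proceeds child -> parent -> grandparent.
def purge(trash, parent_of):
    """trash: set of image names; parent_of: child -> parent. Returns removal order."""
    removed = []
    progress = True
    while progress:
        progress = False
        for img in sorted(trash):
            children = [c for c, p in parent_of.items() if p == img]
            if all(c in removed for c in children):  # no surviving clone blocks it
                trash.discard(img)
                removed.append(img)
                progress = True
    return removed

parent_of = {"testimg2": "testimg1", "testimg5": "testimg2"}

# while the clone testimg5 is still live, its trashed ancestors cannot be purged
blocked = {"testimg1", "testimg2"}
assert purge(blocked, parent_of) == [] and blocked == {"testimg1", "testimg2"}

# after `rbd trash mv testimg5`, purge removes child, then parent, then grandparent
trash = {"testimg1", "testimg2", "testimg5"}
assert purge(trash, parent_of) == ["testimg5", "testimg2", "testimg1"]
```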
2026-03-23T18:19:42.232 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:42.232 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:42.232 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-23T18:19:42.258 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:42.258 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-23T18:19:42.294 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap 2026-03-23T18:19:43.193 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:43.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2 2026-03-23T18:19:43.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap 2026-03-23T18:19:43.273 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:43.278 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-23T18:19:43.516 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg3@snap 2026-03-23T18:19:44.202 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:44.212 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap 2026-03-23T18:19:45.209 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:45.219 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4 2026-03-23T18:19:45.280 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5 2026-03-23T18:19:45.336 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap 2026-03-23T18:19:45.381 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 
2026-03-23T18:19:45.391 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap 2026-03-23T18:19:46.210 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:46.220 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6 2026-03-23T18:19:46.290 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap 2026-03-23T18:19:46.341 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:46.351 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 2026-03-23T18:19:46.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 2026-03-23T18:19:46.458 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3 2026-03-23T18:19:46.505 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg4 2026-03-23T18:19:46.553 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:46.553 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:46.553 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4 2026-03-23T18:19:46.578 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-23T18:19:46.578 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:46.578 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:46.699 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:46.700 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:46.700 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:46.700 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4 2026-03-23T18:19:46.724 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-23T18:19:46.724 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6 2026-03-23T18:19:46.773 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 
2026-03-23T18:19:46.773 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:46.773 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 5 2026-03-23T18:19:46.796 INFO:tasks.workunit.client.0.vm04.stdout:5 2026-03-23T18:19:46.797 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:46.797 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:47.348 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:47.349 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:47.349 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:47.349 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3 2026-03-23T18:19:47.377 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-23T18:19:47.377 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:47.377 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1 2026-03-23T18:19:47.401 INFO:tasks.workunit.client.0.vm04.stdout:251b3179cd1f testimg1 2026-03-23T18:19:47.401 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:47.401 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2 2026-03-23T18:19:47.425 INFO:tasks.workunit.client.0.vm04.stdout:25215ea078ce testimg2 2026-03-23T18:19:47.425 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:47.425 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3 2026-03-23T18:19:47.643 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:47.642+0000 7f63f5951640 0 --2- 192.168.123.104:0/819344463 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x55e861644380 0x55e861638eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-23T18:19:47.650 INFO:tasks.workunit.client.0.vm04.stdout:2527c4bd5656 testimg3 2026-03-23T18:19:47.650 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5 2026-03-23T18:19:47.695 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:47.695 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:47.695 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4 2026-03-23T18:19:47.721 INFO:tasks.workunit.client.0.vm04.stdout:4 2026-03-23T18:19:47.721 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:47.721 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed' 2026-03-23T18:19:49.308 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed 2026-03-23T18:19:49.308 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:49.308 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:49.309 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1 2026-03-23T18:19:49.333 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:19:49.334 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:49.334 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3 2026-03-23T18:19:49.360 INFO:tasks.workunit.client.0.vm04.stdout:2527c4bd5656 testimg3 2026-03-23T18:19:49.361 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls 2026-03-23T18:19:49.361 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }' 2026-03-23T18:19:49.390 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=2527c4bd5656 2026-03-23T18:19:49.391 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 2527c4bd5656 2026-03-23T18:19:50.130 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 2026-03-23T18:19:50.137 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 2026-03-23T18:19:50.190 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 
2026-03-23T18:19:50.195 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls 2026-03-23T18:19:50.195 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:19:50.195 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-23T18:19:50.223 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:19:50.223 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-23T18:19:50.261 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap 2026-03-23T18:19:51.229 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:51.237 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2 2026-03-23T18:19:51.294 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap 2026-03-23T18:19:51.327 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:51.334 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-23T18:19:51.378 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap 2026-03-23T18:19:52.234 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:19:52.242 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4 2026-03-23T18:19:52.304 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5 2026-03-23T18:19:52.365 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap 2026-03-23T18:19:52.407 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:52.416 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap 2026-03-23T18:19:53.237 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-23T18:19:53.249 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6 2026-03-23T18:19:53.317 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap 2026-03-23T18:19:53.367 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:19:53.375 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1 2026-03-23T18:19:53.436 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:19:53.440 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2 2026-03-23T18:19:53.502 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:19:53.506 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3 2026-03-23T18:19:53.554 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4 2026-03-23T18:19:53.618 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T18:19:53.623 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:53.623 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:53.623 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-23T18:19:53.646 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-23T18:19:53.647 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:19:53.647 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-23T18:19:53.895 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-23T18:19:53.895 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:53.895 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:53.895 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-23T18:19:53.923 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-23T18:19:53.923 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:53.923 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-23T18:19:53.949 INFO:tasks.workunit.client.0.vm04.stdout:258584af1751 testimg1
2026-03-23T18:19:53.949 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:53.950 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-23T18:19:53.977 INFO:tasks.workunit.client.0.vm04.stdout:258b2cb7aacf testimg2
2026-03-23T18:19:53.978 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:53.978 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg4
2026-03-23T18:19:54.203 INFO:tasks.workunit.client.0.vm04.stdout:259763b5608 testimg4
2026-03-23T18:19:54.203 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6
2026-03-23T18:19:54.252 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:54.252 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:54.252 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-23T18:19:54.278 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-23T18:19:54.278 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:19:54.278 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-23T18:19:55.308 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-23T18:19:55.308 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:55.309 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:55.309 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-23T18:19:55.332 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-23T18:19:55.332 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:55.332 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-23T18:19:55.355 INFO:tasks.workunit.client.0.vm04.stdout:258584af1751 testimg1
2026-03-23T18:19:55.355 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:55.355 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-23T18:19:55.380 INFO:tasks.workunit.client.0.vm04.stdout:258b2cb7aacf testimg2
2026-03-23T18:19:55.380 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5
2026-03-23T18:19:55.431 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:55.431 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:55.431 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-23T18:19:55.458 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-23T18:19:55.458 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:19:55.536 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:55.534+0000 7f71bea8e640 0 -- 192.168.123.104:0/3695911367 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f719c05c9f0 msgr2=0x7f719c07cdf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:55.544 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:19:55.542+0000 7f71bd004640 0 -- 192.168.123.104:0/3695911367 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f71a0006bb0 msgr2=0x7f71a0026f90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:19:57.312 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-23T18:19:57.316 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:19:57.316 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:19:57.316 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:19:57.341 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:19:57.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-23T18:19:57.380 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap
2026-03-23T18:19:58.269 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:58.278 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-23T18:19:58.327 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap
2026-03-23T18:19:58.364 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:19:58.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-23T18:19:58.417 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg3@snap
2026-03-23T18:19:59.280 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:19:59.288 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap
2026-03-23T18:20:00.287 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:00.294 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-23T18:20:00.345 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-23T18:20:00.400 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap
2026-03-23T18:20:00.437 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:20:00.444 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap
2026-03-23T18:20:01.291 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:01.298 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-23T18:20:01.352 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap
2026-03-23T18:20:01.398 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:20:01.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-23T18:20:01.471 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:20:01.475 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-23T18:20:01.540 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:20:01.544 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-23T18:20:01.600 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-23T18:20:01.663 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:20:01.667 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:01.667 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:01.667 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-23T18:20:01.695 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-23T18:20:01.695 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:20:01.695 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-23T18:20:01.783 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-23T18:20:01.784 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:01.784 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:01.784 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-23T18:20:01.807 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-23T18:20:01.807 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6
2026-03-23T18:20:01.864 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:01.864 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:01.864 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 5
2026-03-23T18:20:01.890 INFO:tasks.workunit.client.0.vm04.stdout:5
2026-03-23T18:20:01.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:20:01.890 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-23T18:20:02.365 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-23T18:20:02.366 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:02.366 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:02.366 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-23T18:20:02.390 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-23T18:20:02.390 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:02.390 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-23T18:20:02.413 INFO:tasks.workunit.client.0.vm04.stdout:25e09372fe22 testimg1
2026-03-23T18:20:02.413 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:02.413 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-23T18:20:02.435 INFO:tasks.workunit.client.0.vm04.stdout:25e6805137bb testimg2
2026-03-23T18:20:02.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:02.435 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-23T18:20:02.458 INFO:tasks.workunit.client.0.vm04.stdout:25ec55c0eb04 testimg3
2026-03-23T18:20:02.458 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5
2026-03-23T18:20:02.506 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:02.506 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:02.506 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-23T18:20:02.529 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-23T18:20:02.529 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:20:02.530 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-23T18:20:04.368 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-23T18:20:04.368 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:04.368 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:04.368 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:20:04.393 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:20:04.393 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:04.393 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-23T18:20:04.611 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:04.610+0000 7f7c4b7fe640 0 --2- 192.168.123.104:0/59633482 >> v2:192.168.123.104:3300/0 conn(0x55c74a7f49d0 0x55c74a814db0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-23T18:20:04.621 INFO:tasks.workunit.client.0.vm04.stdout:25ec55c0eb04 testimg3
2026-03-23T18:20:04.621 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-23T18:20:04.622 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }'
2026-03-23T18:20:04.648 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=25ec55c0eb04
2026-03-23T18:20:04.648 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 25ec55c0eb04
2026-03-23T18:20:05.311 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-23T18:20:05.321 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-23T18:20:05.382 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-23T18:20:05.387 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-23T18:20:05.387 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:05.387 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-23T18:20:05.412 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:20:05.412 INFO:tasks.workunit.client.0.vm04.stderr:+ test_deep_copy_clone
2026-03-23T18:20:05.412 INFO:tasks.workunit.client.0.vm04.stdout:testing deep copy clone...
2026-03-23T18:20:05.413 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing deep copy clone...'
2026-03-23T18:20:05.413 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:20:05.413 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.496 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.652 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.724 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.799 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.882 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:05.953 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.030 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.110 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.186 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.262 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.339 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.416 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.498 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.580 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.657 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.735 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:06.820 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create testimg1 --image-format 2 --size 256
2026-03-23T18:20:06.860 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-23T18:20:07.326 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:07.335 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect testimg1@snap1
2026-03-23T18:20:07.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-23T18:20:07.457 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap2
2026-03-23T18:20:08.330 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:08.343 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg2 testimg3
2026-03-23T18:20:08.421 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:08.418+0000 7f154952b640 0 -- 192.168.123.104:0/3970489344 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563f5216a420 msgr2=0x563f522a9cc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:20:09.341 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete...2026-03-23T18:20:09.338+0000 7f154952b640 0 -- 192.168.123.104:0/3970489344 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f152805ca90 msgr2=0x7f152807ce90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:20:09.344 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-23T18:20:09.350 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:20:09.350 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:20:09.389 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:20:09.389 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:20:09.389 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: rbd/testimg1@snap1'
2026-03-23T18:20:09.429 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/testimg1@snap1
2026-03-23T18:20:09.429 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-23T18:20:09.429 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:20:09.429 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:09.429 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:20:09.465 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:20:09.465 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-23T18:20:09.465 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap2.*'
2026-03-23T18:20:09.500 INFO:tasks.workunit.client.0.vm04.stdout: 42 snap2 256 MiB Mon Mar 23 18:20:09 2026
2026-03-23T18:20:09.500 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-23T18:20:09.500 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features:.*deep-flatten'
2026-03-23T18:20:09.738 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, data-pool
2026-03-23T18:20:09.738 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:20:09.738 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features:.*deep-flatten'
2026-03-23T18:20:09.776 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, data-pool
2026-03-23T18:20:09.776 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten testimg2
2026-03-23T18:20:09.820 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-23T18:20:09.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten testimg3
2026-03-23T18:20:09.873 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-23T18:20:09.883 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect testimg1@snap1
2026-03-23T18:20:09.923 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge testimg2
2026-03-23T18:20:10.287 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-23T18:20:10.297 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge testimg3
2026-03-23T18:20:11.336 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-23T18:20:11.347 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg2
2026-03-23T18:20:11.444 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-23T18:20:11.442+0000 7fa5ba724640 0 -- 192.168.123.104:0/2552500365 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x564c83a78f20 msgr2=0x564c83aaa6c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:20:11.450 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:20:11.455 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg3
2026-03-23T18:20:11.545 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-23T18:20:11.542+0000 7fa9df6af640 0 -- 192.168.123.104:0/28763497 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x560e028bdbf0 msgr2=0x560e028f1040 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:20:11.552 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:20:11.556 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect testimg1@snap1
2026-03-23T18:20:11.599 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-23T18:20:11.653 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap2
2026-03-23T18:20:12.375 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:12.385 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy --flatten testimg2 testimg3
2026-03-23T18:20:13.436 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:13.434+0000 7f5175224640 0 -- 192.168.123.104:0/4091396396 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557dac32b3a0 msgr2=0x557dac3ba820 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:20:13.523 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete...2026-03-23T18:20:13.522+0000 7f5173f9b640 0 -- 192.168.123.104:0/4091396396 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f51500047b0 msgr2=0x7f5150025100 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:20:14.373 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-23T18:20:14.380 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:20:14.380 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-23T18:20:14.411 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:20:14.411 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-23T18:20:14.411 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v parent:
2026-03-23T18:20:14.445 INFO:tasks.workunit.client.0.vm04.stdout:rbd image 'testimg3':
2026-03-23T18:20:14.445 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-23T18:20:14.445 INFO:tasks.workunit.client.0.vm04.stdout: order 22 (4 MiB objects)
2026-03-23T18:20:14.445 INFO:tasks.workunit.client.0.vm04.stdout: snapshot_count: 1
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: id: 26f36ff90f7b
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: data_pool: datapool
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: block_name_prefix: rbd_data.2.26f36ff90f7b
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: format: 2
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, data-pool
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: op_features:
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: flags:
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: create_timestamp: Mon Mar 23 18:20:12 2026
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: access_timestamp: Mon Mar 23 18:20:12 2026
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stdout: modify_timestamp: Mon Mar 23 18:20:12 2026
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:14.446 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-23T18:20:14.477 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-23T18:20:14.477 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-23T18:20:14.477 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap2.*'
2026-03-23T18:20:14.510 INFO:tasks.workunit.client.0.vm04.stdout: 44 snap2 256 MiB Mon Mar 23 18:20:13 2026
2026-03-23T18:20:14.510 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-23T18:20:14.510 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features:.*deep-flatten'
2026-03-23T18:20:14.550 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, data-pool
2026-03-23T18:20:14.550 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten testimg2
2026-03-23T18:20:14.594 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete...
Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done. 2026-03-23T18:20:14.606 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect testimg1@snap1 2026-03-23T18:20:14.648 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:20:14.648 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:15.480 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:16.464 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:17.664 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:17.750 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:17.833 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:17.913 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:17.987 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.066 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.145 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 
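The `rbd info` assertions above work by grepping the command's output; under `set -e`, a non-matching grep exits nonzero and aborts the workunit. A minimal stand-alone sketch of that pattern, using a canned `rbd info`-style fragment (hypothetical data, since no cluster is available here; the real workunit pipes the live command):

```shell
# Canned stand-in for `rbd info testimg3` output (assumed fragment).
info="rbd image 'testimg3':
  size 256 MiB in 64 objects
  features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, data-pool"

# grep's exit status is the assertion: a missing pattern fails the
# script when run under `set -e`, as in cli_generic.sh.
printf '%s\n' "$info" | grep 'size 256 MiB'
printf '%s\n' "$info" | grep 'features:.*deep-flatten'
```

The same idea drives the `grep -v parent:` check: there the assertion is that the flattened image's `rbd info` output no longer carries a `parent:` line.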
2026-03-23T18:20:18.219 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.294 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.445 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.524 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.593 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.658 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.729 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.804 INFO:tasks.workunit.client.0.vm04.stdout:testing clone v2... 2026-03-23T18:20:18.805 INFO:tasks.workunit.client.0.vm04.stderr:+ test_clone_v2 2026-03-23T18:20:18.805 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing clone v2...' 2026-03-23T18:20:18.805 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:20:18.805 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.880 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:18.954 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.033 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.108 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.183 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.254 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.327 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.422 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.501 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.582 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:19.662 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:19.942 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:20.019 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:20.090 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:20.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:20.231 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:20.311 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:20.386 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-23T18:20:20.428 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@1
2026-03-23T18:20:21.293 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:21.303 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test2
2026-03-23T18:20:21.332 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:21.330+0000 7f95fd8e9640 -1 librbd::image::CloneRequest: 0x56099b72ce60 validate_parent: parent snapshot must be protected
2026-03-23T18:20:21.333 INFO:tasks.workunit.client.0.vm04.stderr:rbd: clone error: (22) Invalid argument
2026-03-23T18:20:21.336 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-23T18:20:21.336 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 test1@1 test2
2026-03-23T18:20:21.394 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd snap ls test1 --format json
2026-03-23T18:20:21.394 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.[] | select(.name == "1") | .id'
2026-03-23T18:20:21.430 INFO:tasks.workunit.client.0.vm04.stderr:+ SNAP_ID=45
2026-03-23T18:20:21.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 --snap-id 45 test1 test3
2026-03-23T18:20:21.492 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@1
2026-03-23T18:20:21.533 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test4
2026-03-23T18:20:21.598 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children test1@1
2026-03-23T18:20:21.598 INFO:tasks.workunit.client.0.vm04.stderr:+ sort
2026-03-23T18:20:21.598 INFO:tasks.workunit.client.0.vm04.stderr:+ tr '\n' ' '
2026-03-23T18:20:21.598 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -E 'test2.*test3.*test4'
2026-03-23T18:20:21.636 INFO:tasks.workunit.client.0.vm04.stdout:rbd/test2 rbd/test3 rbd/test4
2026-03-23T18:20:21.636 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --descendants test1
2026-03-23T18:20:21.636 INFO:tasks.workunit.client.0.vm04.stderr:+ sort
2026-03-23T18:20:21.636 INFO:tasks.workunit.client.0.vm04.stderr:+ tr '\n' ' '
2026-03-23T18:20:21.636 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -E 'test2.*test3.*test4'
2026-03-23T18:20:21.881 INFO:tasks.workunit.client.0.vm04.stdout:rbd/test2 rbd/test3 rbd/test4
2026-03-23T18:20:21.881 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove test4
2026-03-23T18:20:21.971 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:21.970+0000 7fea51dd5640 0 -- 192.168.123.104:0/3677873752 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x558c8c58db50 msgr2=0x558c8c5bf680 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:20:21.972 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:20:21.976 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test1@1
2026-03-23T18:20:22.016 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap remove test1@1
2026-03-23T18:20:22.054 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
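The `rbd children test1@1 | sort | tr '\n' ' ' | grep -E 'test2.*test3.*test4'` check above flattens the child list onto one line before matching all three clones in order. The same pipeline can be exercised against canned input (hypothetical stand-in, since no cluster is present):

```shell
# Canned stand-in for `rbd children test1@1` output, one child per line,
# deliberately out of order to show why the sort matters.
printf 'rbd/test3\nrbd/test2\nrbd/test4\n' \
    | sort \
    | tr '\n' ' ' \
    | grep -E 'test2.*test3.*test4'
# prints: rbd/test2 rbd/test3 rbd/test4
```

Sorting first makes the ordered `test2.*test3.*test4` pattern deterministic regardless of the order in which `rbd children` lists the clones.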
2026-03-23T18:20:22.064 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap list --all test1 2026-03-23T18:20:22.064 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -E 'trash \(user 1\) *$' 2026-03-23T18:20:22.099 INFO:tasks.workunit.client.0.vm04.stdout: 45 2fd33e98-d674-4deb-9ea7-e58dbaf1a4ef 1 MiB Mon Mar 23 18:20:21 2026 trash (user 1) 2026-03-23T18:20:22.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@2 2026-03-23T18:20:22.299 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:20:22.312 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-23T18:20:22.312 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'image has snapshots' 2026-03-23T18:20:22.361 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image has snapshots - these must be deleted with 'rbd snap purge' before the image can be removed. 2026-03-23T18:20:22.361 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@2 2026-03-23T18:20:23.300 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:20:23.310 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-23T18:20:23.310 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'linked clones' 2026-03-23T18:20:23.363 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed. 2026-03-23T18:20:23.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test3 2026-03-23T18:20:23.481 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:23.478+0000 7f570e5e3640 0 -- 192.168.123.104:0/2522865564 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x56201401cbf0 msgr2=0x56201404fb20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:20:23.494 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T18:20:23.498 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-23T18:20:23.498 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'linked clones' 2026-03-23T18:20:23.552 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed. 2026-03-23T18:20:23.552 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten test2 2026-03-23T18:20:24.313 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 100% complete...done. 2026-03-23T18:20:24.324 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap list --all test1 2026-03-23T18:20:24.324 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:20:24.324 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-23T18:20:24.358 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:20:24.358 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-23T18:20:24.604 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:24.602+0000 7f7c9f743640 0 -- 192.168.123.104:0/640334548 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x555e29134b50 msgr2=0x555e291666a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:24.614 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:20:24.618 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-23T18:20:24.722 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:24.718+0000 7ff587d46640 0 -- 192.168.123.104:0/2476667562 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x557d6034eb50 msgr2=0x557d60380670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:24.728 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-23T18:20:24.733 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-23T18:20:24.781 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@1
2026-03-23T18:20:25.318 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:25.330 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@2
2026-03-23T18:20:26.512 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:20:26.521 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@1 test2 --rbd-default-clone-format 2
2026-03-23T18:20:26.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@2 test3 --rbd-default-clone-format 2
2026-03-23T18:20:26.656 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@1
2026-03-23T18:20:26.703 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:20:26.715 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@2
2026-03-23T18:20:26.755 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:20:26.766 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd rm test1
2026-03-23T18:20:26.766 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-23T18:20:26.808 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:26.806+0000 7f9c25027200 -1 librbd::api::Image: remove: image has snapshots - not removing
2026-03-23T18:20:26.808 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-23T18:20:26.812 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
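The `expect_fail rbd rm test1` step above is a negative test: the removal is supposed to fail because the trashed snapshots still have linked clones. The `expect_fail` helper's definition is not visible in this excerpt; a minimal sketch of its assumed behavior (succeed only when the wrapped command fails, so negative tests survive `set -e`):

```shell
# Assumed shape of the workunit's expect_fail helper (the real one
# lives in the qa workunit scripts, not shown in this log).
expect_fail() {
    if "$@"; then
        return 1    # command unexpectedly succeeded
    fi
    return 0        # command failed, as expected
}

# Demonstration with plain shell commands (no cluster required):
expect_fail false && echo "negative test passed"
expect_fail true || echo "negative test would have flagged this"
```

Note in the log that the expected failure still prints the librbd error and `Removing image: 0% complete...failed.`; only the exit status is inverted.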
2026-03-23T18:20:26.816 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:20:26.816 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 --rbd-move-parent-to-trash-on-remove=true 2026-03-23T18:20:26.883 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:20:26.887 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -a 2026-03-23T18:20:26.887 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-23T18:20:26.916 INFO:tasks.workunit.client.0.vm04.stdout:28255b10e881 test1 2026-03-23T18:20:26.916 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-23T18:20:27.005 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:27.002+0000 7f8402226640 0 -- 192.168.123.104:0/679853240 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x556d62e8eb50 msgr2=0x556d62ec0700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:20:27.014 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:27.010+0000 7f83fbfff640 0 -- 192.168.123.104:0/679853240 >> [v2:192.168.123.104:6816/2446644018,v1:192.168.123.104:6817/2446644018] conn(0x7f83dc007670 msgr2=0x7f83dc003230 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:27.533 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
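The trash checks above assert that the parent lingers in the trash while clones exist (`rbd trash ls -a | grep test1`) and, later in the run, that it is gone (`expect_fail grep test1`). Both reduce to grep's exit status on the listing; a sketch against canned listings (hypothetical data, standing in for the live `rbd trash ls -a` output):

```shell
# Canned trash listings: before and after the last clone is removed.
before="28255b10e881 test1"
after=""

# Present: grep exits 0 and echoes the matching entry.
printf '%s\n' "$before" | grep test1

# Absent: grep exits nonzero, which the workunit wraps in expect_fail.
if ! printf '%s' "$after" | grep -q test1; then
    echo "test1 purged from trash"
fi
```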
2026-03-23T18:20:27.539 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -a 2026-03-23T18:20:27.539 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-23T18:20:27.566 INFO:tasks.workunit.client.0.vm04.stdout:28255b10e881 test1 2026-03-23T18:20:27.566 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test3 2026-03-23T18:20:27.653 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:27.650+0000 7f36cf6f9640 0 -- 192.168.123.104:0/3464970878 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55ba96743f20 msgr2=0x55ba967756c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:27.665 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:27.662+0000 7f36cf6f9640 0 -- 192.168.123.104:0/3464970878 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f36ac05c9f0 msgr2=0x7f36ac07cdf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:28.558 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:20:28.563 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -a 2026-03-23T18:20:28.563 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep test1 2026-03-23T18:20:28.563 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-23T18:20:28.597 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:20:28.597 INFO:tasks.workunit.client.0.vm04.stdout:testing thick provision... 2026-03-23T18:20:28.597 INFO:tasks.workunit.client.0.vm04.stderr:+ test_thick_provision 2026-03-23T18:20:28.597 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing thick provision...' 
2026-03-23T18:20:28.597 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:20:28.597 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:28.681 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:28.766 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:28.851 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:28.933 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.018 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.103 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.191 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.335 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.420 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.566 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.741 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.903 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:29.983 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:30.066 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:30.139 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:30.217 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:30.300 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:20:30.381 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --thick-provision -s 64M test1 2026-03-23T18:20:30.746 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 6% complete... Thick provisioning: 12% complete... Thick provisioning: 18% complete... Thick provisioning: 25% complete... Thick provisioning: 31% complete... Thick provisioning: 37% complete... 
Thick provisioning: 43% complete... Thick provisioning: 50% complete... Thick provisioning: 56% complete... Thick provisioning: 62% complete... Thick provisioning: 68% complete... Thick provisioning: 75% complete... Thick provisioning: 81% complete... Thick provisioning: 87% complete... Thick provisioning: 93% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done. 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ ret= 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']' 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' ' 2026-03-23T18:20:30.759 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5 2026-03-23T18:20:30.763 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^64 MiB' 2026-03-23T18:20:30.795 INFO:tasks.workunit.client.0.vm04.stdout:64 MiB 2026-03-23T18:20:30.795 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0 2026-03-23T18:20:30.795 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']' 2026-03-23T18:20:30.795 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:20:30.795 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du 2026-03-23T18:20:30.826 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED 2026-03-23T18:20:30.826 INFO:tasks.workunit.client.0.vm04.stdout:test1 64 MiB 64 MiB 2026-03-23T18:20:30.830 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']' 2026-03-23T18:20:30.830 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-23T18:20:30.939 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 6% complete... Removing image: 12% complete... Removing image: 18% complete... Removing image: 25% complete... Removing image: 31% complete... Removing image: 37% complete... 
Removing image: 43% complete... Removing image: 50% complete... Removing image: 56% complete... Removing image: 62% complete... Removing image: 68% complete... Removing image: 75% complete... Removing image: 81% complete... Removing image: 87% complete... Removing image: 93% complete...2026-03-23T18:20:30.938+0000 7f7398af0640 0 -- 192.168.123.104:0/1542267610 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563ab67aeb50 msgr2=0x563ab67e0700 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:30.977 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:30.974+0000 7f7398af0640 0 -- 192.168.123.104:0/1542267610 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f737805cac0 msgr2=0x7f737807cec0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:30.979 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:20:30.985 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-23T18:20:30.985 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-23T18:20:30.985 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:20:30.985 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-23T18:20:31.012 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:20:31.012 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --thick-provision -s 4G test1 2026-03-23T18:20:31.902 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 1% complete... Thick provisioning: 2% complete... Thick provisioning: 3% complete... 
Thick provisioning: 4% complete...2026-03-23T18:20:31.898+0000 7f4e3de36640 0 -- 192.168.123.104:0/317406958 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55d0984c9ce0 msgr2=0x55d098408660 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:32.075 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 5% complete...2026-03-23T18:20:32.074+0000 7f4e3cbad640 0 -- 192.168.123.104:0/317406958 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55d09855f3d0 msgr2=0x55d09857f850 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:53.997 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 6% complete... Thick provisioning: 7% complete... Thick provisioning: 8% complete... Thick provisioning: 9% complete... Thick provisioning: 10% complete... Thick provisioning: 11% complete... Thick provisioning: 12% complete... Thick provisioning: 13% complete... Thick provisioning: 14% complete... Thick provisioning: 15% complete... Thick provisioning: 16% complete... Thick provisioning: 17% complete... Thick provisioning: 18% complete... Thick provisioning: 19% complete... Thick provisioning: 20% complete... Thick provisioning: 21% complete... Thick provisioning: 22% complete... Thick provisioning: 23% complete... Thick provisioning: 24% complete... Thick provisioning: 25% complete... Thick provisioning: 26% complete... Thick provisioning: 27% complete... Thick provisioning: 28% complete... Thick provisioning: 29% complete... Thick provisioning: 30% complete... Thick provisioning: 31% complete... Thick provisioning: 32% complete... Thick provisioning: 33% complete... Thick provisioning: 34% complete... Thick provisioning: 35% complete... Thick provisioning: 36% complete... Thick provisioning: 37% complete... Thick provisioning: 38% complete... Thick provisioning: 39% complete... Thick provisioning: 40% complete... 
Thick provisioning: 41% complete... Thick provisioning: 42% complete... Thick provisioning: 43% complete... Thick provisioning: 44% complete... Thick provisioning: 45% complete... Thick provisioning: 46% complete... Thick provisioning: 47% complete... Thick provisioning: 48% complete... Thick provisioning: 49% complete... Thick provisioning: 50% complete... Thick provisioning: 51% complete... Thick provisioning: 52% complete... Thick provisioning: 53% complete... Thick provisioning: 54% complete... Thick provisioning: 55% complete... Thick provisioning: 56% complete... Thick provisioning: 57% complete... Thick provisioning: 58% complete... Thick provisioning: 59% complete... Thick provisioning: 60% complete... Thick provisioning: 61% complete... Thick provisioning: 62% complete... Thick provisioning: 63% complete... Thick provisioning: 64% complete... Thick provisioning: 65% complete... Thick provisioning: 66% complete... Thick provisioning: 67% complete... Thick provisioning: 68% complete... Thick provisioning: 69% complete... Thick provisioning: 70% complete... Thick provisioning: 71% complete... Thick provisioning: 72% complete... Thick provisioning: 73% complete... Thick provisioning: 74% complete... Thick provisioning: 75% complete... Thick provisioning: 76% complete... Thick provisioning: 77% complete... Thick provisioning: 78% complete... Thick provisioning: 79% complete... Thick provisioning: 80% complete... Thick provisioning: 81% complete... Thick provisioning: 82% complete... Thick provisioning: 83% complete... Thick provisioning: 84% complete... Thick provisioning: 85% complete... Thick provisioning: 86% complete... Thick provisioning: 87% complete... Thick provisioning: 88% complete... Thick provisioning: 89% complete... Thick provisioning: 90% complete... Thick provisioning: 91% complete... Thick provisioning: 92% complete... Thick provisioning: 93% complete... Thick provisioning: 94% complete... Thick provisioning: 95% complete... 
Thick provisioning: 96% complete... Thick provisioning: 97% complete... Thick provisioning: 98% complete... Thick provisioning: 99% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done. 2026-03-23T18:20:54.009 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0 2026-03-23T18:20:54.010 INFO:tasks.workunit.client.0.vm04.stderr:+ ret= 2026-03-23T18:20:54.010 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']' 2026-03-23T18:20:54.010 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du 2026-03-23T18:20:54.010 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-23T18:20:54.010 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' ' 2026-03-23T18:20:54.010 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^4 GiB' 2026-03-23T18:20:54.011 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5 2026-03-23T18:20:54.039 INFO:tasks.workunit.client.0.vm04.stdout:4 GiB 2026-03-23T18:20:54.040 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0 2026-03-23T18:20:54.040 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']' 2026-03-23T18:20:54.040 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:20:54.040 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du 2026-03-23T18:20:54.065 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED 2026-03-23T18:20:54.066 INFO:tasks.workunit.client.0.vm04.stdout:test1 4 GiB 4 GiB 2026-03-23T18:20:54.068 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']' 2026-03-23T18:20:54.068 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-23T18:20:54.167 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete...2026-03-23T18:20:54.166+0000 7fee81ea5640 0 -- 192.168.123.104:0/1071140814 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55f8fa68f650 msgr2=0x55f8fa6afad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:54.204 INFO:tasks.workunit.client.0.vm04.stderr: Removing 
image: 2% complete... Removing image: 3% complete...2026-03-23T18:20:54.202+0000 7fee81ea5640 0 -- 192.168.123.104:0/1071140814 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fee6805c960 msgr2=0x7fee6807cd60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:56.580 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... 
Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done.
2026-03-23T18:20:56.584 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-23T18:20:56.584 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-23T18:20:56.584 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:56.584 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:20:56.607 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:20:56.607 INFO:tasks.workunit.client.0.vm04.stdout:testing namespace...
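The `rbd du` check traced above is a small text-processing pipeline (`grep | tr -s | cut`) that the workunit polls until the image reports its full thick-provisioned size. As a minimal, self-contained sketch, the same pipeline can be exercised against canned `rbd du` output (the sample text below is an assumption for illustration, not taken from this run; a live check would pipe `rbd du` itself):

```shell
# Canned stand-in for `rbd du` output (assumed shape, see lead-in).
du_output='NAME PROVISIONED USED
test1 4 GiB 4 GiB'

# Same pipeline as the workunit trace: pick the test1 row, squeeze
# repeated spaces, then take fields 4-5 (the USED column, e.g. "4 GiB").
used=$(printf '%s\n' "$du_output" | grep test1 | tr -s ' ' | cut -d ' ' -f 4-5)
echo "$used"
```

With the sample row `test1 4 GiB 4 GiB`, fields 4-5 after space-squeezing are `4 GiB`, which is exactly what the trace's `grep '^4 GiB'` guard matches before the loop breaks.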
2026-03-23T18:20:56.607 INFO:tasks.workunit.client.0.vm04.stderr:+ test_namespace
2026-03-23T18:20:56.607 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing namespace...'
2026-03-23T18:20:56.607 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:20:56.607 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:56.667 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:56.726 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:56.788 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:56.857 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:56.935 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.012 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.225 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.295 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.444 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.516 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.595 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.707 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.780 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.848 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:20:57.917 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace ls
2026-03-23T18:20:57.917 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:57.917 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-23T18:20:57.944 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-23T18:20:57.944 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd/test1
2026-03-23T18:20:57.976 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create --pool rbd --namespace test2
2026-03-23T18:20:58.013 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create --namespace test3
2026-03-23T18:20:58.046 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd namespace create rbd/test3
2026-03-23T18:20:58.046 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd/test3
2026-03-23T18:20:58.071 INFO:tasks.workunit.client.0.vm04.stderr:rbd: failed to created namespace: 2026-03-23T18:20:58.070+0000 7fa2a0daf200 -1 librbd::api::Namespace: create: failed to add namespace: (17) File exists
2026-03-23T18:20:58.071 INFO:tasks.workunit.client.0.vm04.stderr:(17) File exists
2026-03-23T18:20:58.075 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:20:58.075 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace list
2026-03-23T18:20:58.075 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test
2026-03-23T18:20:58.075 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-23T18:20:58.075 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^3$'
2026-03-23T18:20:58.101 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-23T18:20:58.101 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd namespace remove --pool rbd missing
2026-03-23T18:20:58.101 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove --pool rbd missing
2026-03-23T18:20:58.118 INFO:tasks.workunit.client.0.vm04.stderr:rbd: namespace name was not specified
2026-03-23T18:20:58.120 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:20:58.120 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image1
2026-03-23T18:20:58.159 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand
--io-total 32M --io-size 4K rbd/test1/image1 2026-03-23T18:20:58.194 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random 2026-03-23T18:20:58.227 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:58.226+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563a4f83edb0 msgr2=0x563a4f870de0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:20:58.235 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:58.234+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x563a4f83edb0 msgr2=0x7f79a407d530 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:20:58.436 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:58.434+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x563a4f83edb0 msgr2=0x7f79a407d530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:20:58.524 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:20:58.522+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f79a405cb00 msgr2=0x7f79a407cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:00.684 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-23T18:21:00.684 INFO:tasks.workunit.client.0.vm04.stdout: 2 2336 945.336 3.7 MiB/s 2026-03-23T18:21:01.373 INFO:tasks.workunit.client.0.vm04.stdout: 3 2352 745.591 2.9 MiB/s 2026-03-23T18:21:01.837 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:01.834+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> 
[v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a00049c0 msgr2=0x7f79a407da70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:01.934 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:01.930+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563a4f83edb0 msgr2=0x7f79a407d530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:02.255 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:02.250+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563a4f83edb0 msgr2=0x7f79a40cc010 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:02.431 INFO:tasks.workunit.client.0.vm04.stdout: 4 4096 970.726 3.8 MiB/s 2026-03-23T18:21:03.134 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:03.130+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a405cb00 msgr2=0x7f79a407cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:03.273 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:03.270+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a00049c0 msgr2=0x7f79a415def0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:04.228 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:04.226+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563a4f83edb0 msgr2=0x7f79a415d9b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:04.986 
INFO:tasks.workunit.client.0.vm04.stdout: 6 4320 638.773 2.5 MiB/s 2026-03-23T18:21:05.201 INFO:tasks.workunit.client.0.vm04.stdout: 7 4528 648.771 2.5 MiB/s 2026-03-23T18:21:05.851 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:05.850+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a40c7110 msgr2=0x7f79a4154c30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:05.852 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:05.850+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x563a4f83edb0 msgr2=0x7f79a40cb1d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:05.853 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:05.850+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a40cdcd0 msgr2=0x7f79a40cb710 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:05.872 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:05.870+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f79a40c3b20 msgr2=0x7f79a40cb710 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:06.805 INFO:tasks.workunit.client.0.vm04.stdout: 8 6448 671.894 2.6 MiB/s 2026-03-23T18:21:07.208 INFO:tasks.workunit.client.0.vm04.stdout: 9 7360 858.121 3.4 MiB/s 2026-03-23T18:21:08.041 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:08.038+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a405cb00 msgr2=0x7f79a407cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until 
injecting socket failure 2026-03-23T18:21:08.202 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:08.198+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a40c3b20 msgr2=0x7f79a40d4430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:09.713 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:09.710+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x563a4f83edb0 msgr2=0x7f79a40d4ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:09.937 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:09.934+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a405cb00 msgr2=0x7f79a407cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:10.707 INFO:tasks.workunit.client.0.vm04.stdout: 12 7664 431.126 1.7 MiB/s 2026-03-23T18:21:12.183 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:12.182+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x563a4f83edb0 msgr2=0x7f79a40d4400 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:12.194 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:12.190+0000 7f79c3665640 0 -- 192.168.123.104:0/4129754456 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f79a40c3b20 msgr2=0x7f79a40c3ef0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:12.871 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:12.870+0000 7f79c48ee640 0 -- 192.168.123.104:0/4129754456 >> 
[v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f79a405cb00 msgr2=0x7f79a407cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:21:13.757 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 15 ops: 8192 ops/sec: 526.342 bytes/sec: 2.1 MiB/s
2026-03-23T18:21:13.770 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd/test1/image1@1
2026-03-23T18:21:14.546 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:21:14.555 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/test1/image1@1 rbd/test2/image1
2026-03-23T18:21:14.614 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd/test1/image1@1
2026-03-23T18:21:14.650 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-23T18:21:14.661 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /dev/fd/63 /dev/fd/62
2026-03-23T18:21:14.661 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/test1/image1 -
2026-03-23T18:21:14.661 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/test2/image1 -
2026-03-23T18:21:15.350 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 2% complete... Exporting image: 3% complete... Exporting image: 1% complete... Exporting image: 2% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 5% complete... Exporting image: 4% complete... Exporting image: 5% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 8% complete... Exporting image: 8% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 11% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 11% complete... Exporting image: 12% complete...
Exporting image: 12% complete... Exporting image: 13% complete... Exporting image: 14% complete... Exporting image: 15% complete...2026-03-23T18:21:15.346+0000 7f6a7140e640 0 -- 192.168.123.104:0/2330439584 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f6a48004990 msgr2=0x7f6a48024d70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:15.389 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 13% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 16% complete...2026-03-23T18:21:15.386+0000 7f6a7140e640 0 -- 192.168.123.104:0/2330439584 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f6a5005cae0 msgr2=0x7f6a5007cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:15.463 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 16% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 19% complete...2026-03-23T18:21:15.458+0000 7f37ef042640 0 -- 192.168.123.104:0/3944518048 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f37cc0047b0 msgr2=0x7f37cc025100 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:15.493 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:15.490+0000 7f37ef042640 0 -- 192.168.123.104:0/3944518048 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7f37d005c8c0 msgr2=0x7f37d007ccc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:18.583 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 19% complete... Exporting image: 20% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 22% complete... 
Exporting image: 23% complete... Exporting image: 21% complete... Exporting image: 22% complete... Exporting image: 23% complete... Exporting image: 24% complete... Exporting image: 24% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 27% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 27% complete... Exporting image: 28% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 30% complete... Exporting image: 31% complete... Exporting image: 29% complete... Exporting image: 30% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 32% complete... Exporting image: 33% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 33% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 36% complete... Exporting image: 37% complete... Exporting image: 36% complete... Exporting image: 37% complete... Exporting image: 38% complete... Exporting image: 39% complete... Exporting image: 38% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 41% complete... Exporting image: 40% complete... Exporting image: 41% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 44% complete... Exporting image: 45% complete... Exporting image: 44% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 47% complete... Exporting image: 46% complete... Exporting image: 47% complete... Exporting image: 48% complete... Exporting image: 49% complete... Exporting image: 50% complete... Exporting image: 48% complete... Exporting image: 49% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 51% complete... 
Exporting image: 52% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 52% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 55% complete... Exporting image: 56% complete... Exporting image: 55% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 58% complete... Exporting image: 57% complete... Exporting image: 58% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 61% complete... Exporting image: 62% complete... Exporting image: 61% complete... Exporting image: 62% complete... Exporting image: 63% complete... Exporting image: 64% complete... Exporting image: 63% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 66% complete... Exporting image: 67% complete... Exporting image: 65% complete... Exporting image: 66% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 68% complete... Exporting image: 69% complete... Exporting image: 69% complete... Exporting image: 70% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 71% complete... Exporting image: 72% complete... Exporting image: 72% complete... Exporting image: 73% complete... Exporting image: 73% complete... Exporting image: 74% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 74% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 77% complete... Exporting image: 77% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 80% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 80% complete... Exporting image: 81% complete... Exporting image: 81% complete... 
Exporting image: 82% complete... Exporting image: 83% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 82% complete... Exporting image: 83% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 86% complete... Exporting image: 87% complete... Exporting image: 88% complete... Exporting image: 89% complete... Exporting image: 86% complete... Exporting image: 87% complete... Exporting image: 88% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 91% complete... Exporting image: 92% complete... Exporting image: 90% complete... Exporting image: 91% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 93% complete... Exporting image: 94% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 94% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 97% complete... Exporting image: 98% complete... Exporting image: 97% complete... Exporting image: 98% complete... Exporting image: 99% complete... Exporting image: 99% complete... Exporting image: 100% complete...done. 2026-03-23T18:21:18.585 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done. 2026-03-23T18:21:18.599 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test2/image1 2026-03-23T18:21:18.696 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... 
Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... 
Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete...2026-03-23T18:21:18.694+0000 7fd3b3fff640 0 -- 192.168.123.104:0/2001004383 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fd398008d30 msgr2=0x7fd3980291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:21:18.707 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:18.706+0000 7fd3b3fff640 0 -- 192.168.123.104:0/2001004383 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fd39405cae0 msgr2=0x7fd39407cee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:21:19.016 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
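In the trace, `+ cmp /dev/fd/63 /dev/fd/62` alongside the two `++ rbd export ... -` lines is bash process substitution: the clone and its parent are exported to stdout and compared byte-for-byte with no temporary files. A minimal sketch of the same pattern, fed with canned data instead of `rbd export` (requires bash; the strings below are placeholders for the two image streams):

```shell
#!/usr/bin/env bash
# Compare two data streams byte-for-byte via process substitution,
# the same shape as: cmp <(rbd export imgA -) <(rbd export imgB -)
if cmp -s <(printf 'clone-data') <(printf 'clone-data'); then
    result="images match"
else
    result="images differ"
fi
echo "$result"
```

`cmp` exits non-zero on the first differing byte, so the workunit fails fast if the format-2 clone diverges from the image it was cloned from.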
2026-03-23T18:21:19.020 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd/image2 2026-03-23T18:21:19.062 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 32M --io-size 4K rbd/image2 2026-03-23T18:21:19.099 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random 2026-03-23T18:21:19.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:19.434+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fa798008d30 msgr2=0x7fa7980291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:21.807 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-23T18:21:21.807 INFO:tasks.workunit.client.0.vm04.stdout: 2 3584 1331.36 5.2 MiB/s 2026-03-23T18:21:21.890 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:21.886+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fa798008d30 msgr2=0x7fa7a007d5b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:24.559 INFO:tasks.workunit.client.0.vm04.stdout: 5 3600 662.756 2.6 MiB/s 2026-03-23T18:21:24.621 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:24.618+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55f7dda73a20 msgr2=0x7fa7a0145ea0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:24.622 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:24.618+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fa798008d30 msgr2=0x7fa7a0146480 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send 
injecting socket failure 2026-03-23T18:21:24.623 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:24.622+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55f7dda73a20 msgr2=0x7fa7a0145e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:26.263 INFO:tasks.workunit.client.0.vm04.stdout: 7 3616 507.262 2.0 MiB/s 2026-03-23T18:21:27.099 INFO:tasks.workunit.client.0.vm04.stdout: 8 3760 472.235 1.8 MiB/s 2026-03-23T18:21:28.186 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:28.182+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55f7dda73a20 msgr2=0x7fa7a00c0bd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:29.913 INFO:tasks.workunit.client.0.vm04.stdout: 10 5392 500.369 2.0 MiB/s 2026-03-23T18:21:30.369 INFO:tasks.workunit.client.0.vm04.stdout: 11 5424 214.953 860 KiB/s 2026-03-23T18:21:30.559 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:30.554+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fa798008d30 msgr2=0x7fa7a00c13b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:31.313 INFO:tasks.workunit.client.0.vm04.stdout: 12 5616 298.578 1.2 MiB/s 2026-03-23T18:21:32.103 INFO:tasks.workunit.client.0.vm04.stdout: 13 6224 446.575 1.7 MiB/s 2026-03-23T18:21:34.050 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:34.046+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x55f7dda73a20 msgr2=0x7fa7a007f8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:34.182 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:34.178+0000 7fa7c0b8b640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6802/2398517092,v1:192.168.123.104:6804/2398517092] conn(0x7fa798008d30 msgr2=0x7fa7a00c0770 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:35.208 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:35.206+0000 7fa7c1e14640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fa798008d30 msgr2=0x7fa7a0111740 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:35.209 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:35.206+0000 7fa7c1e14640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fa7a005cb80 msgr2=0x7fa7a007cf60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:37.687 INFO:tasks.workunit.client.0.vm04.stdout: 18 8160 415.564 1.6 MiB/s 2026-03-23T18:21:38.526 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:38.522+0000 7fa7c1e14640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fa798008d30 msgr2=0x7fa7a008abf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:21:38.541 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:38.538+0000 7fa7c1e14640 0 -- 192.168.123.104:0/1661332481 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7fa798008d30 msgr2=0x7fa7a010df20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:21:40.361 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 21 ops: 8192 ops/sec: 385.324 bytes/sec: 1.5 MiB/s 2026-03-23T18:21:40.377 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd/image2@1 2026-03-23T18:21:41.004 
INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:21:41.016 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/image2@1 rbd/test2/image2 2026-03-23T18:21:41.071 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd/image2@1 2026-03-23T18:21:41.110 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:21:41.117 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/image2 - 2026-03-23T18:21:41.117 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /dev/fd/63 /dev/fd/62 2026-03-23T18:21:41.118 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/test2/image2 - 2026-03-23T18:21:41.553 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... [two interleaved "Exporting image: N% complete..." progress streams (1%-99%) and four "injecting socket failure" messenger events (18:21:41.550 through 18:21:42.058) elided] ... Exporting image: 100% complete...done. Exporting image: 100% complete...done. 2026-03-23T18:21:45.248 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-23T18:21:45.266 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd rm rbd/image2 2026-03-23T18:21:45.266 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/image2 2026-03-23T18:21:45.311 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:45.310+0000 7f055826c200 -1 librbd::api::Image: remove: image has snapshots - not removing 2026-03-23T18:21:45.311 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed. 2026-03-23T18:21:45.315 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed. 2026-03-23T18:21:45.319 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:45.320 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test2/image2 2026-03-23T18:21:45.401 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... [progress elided] ... Removing image: 7% complete...
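The trace above runs `expect_fail rbd rm rbd/image2` and then `+ return 0` after the command fails, i.e. a failure of the wrapped command is the passing case. A minimal sketch of this helper idiom, assuming the usual qa-suite pattern (the actual helper in rbd/cli_generic.sh may differ in detail):

```shell
# Sketch of the expect_fail idiom seen in the trace: run a command that is
# supposed to fail and invert its exit status, so the test script (running
# under `set -e`) only aborts if the command unexpectedly succeeds.
expect_fail() {
    if "$@"; then
        return 1    # command unexpectedly succeeded
    else
        return 0    # command failed, as the test expects
    fi
}

expect_fail false && echo "failure detected as expected"
```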
Removing image: 8% complete... [progress to 99% elided; two "injecting socket failure" messenger events at 18:21:45.398 and 18:21:45.410 elided] 2026-03-23T18:21:46.560 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:21:46.564 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/image2 2026-03-23T18:21:46.698 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete...
Removing image: 5% complete... [progress to 99% elided; two "injecting socket failure" messenger events at 18:21:46.694 and 18:21:46.854 elided] ... Removing image: 100% complete...done.
2026-03-23T18:21:48.103 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image3 2026-03-23T18:21:48.144 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd/test1/image3@1 2026-03-23T18:21:48.521 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-23T18:21:48.534 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd/test1/image3@1 2026-03-23T18:21:48.576 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format 1 rbd/test1/image3@1 rbd/test1/image4 2026-03-23T18:21:48.635 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test1/image4 2026-03-23T18:21:48.716 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... [progress to 99% elided; one "injecting socket failure" messenger event at 18:21:48.714 elided] 2026-03-23T18:21:48.725 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:21:48.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect rbd/test1/image3@1 2026-03-23T18:21:48.771 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd/test1/image3@1 2026-03-23T18:21:49.553 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-23T18:21:49.564 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test1/image3 2026-03-23T18:21:49.651 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... [progress elided] ... Removing image: 35% complete...
Removing image: 36% complete... [progress to 99% elided; one "injecting socket failure" messenger event at 18:21:49.650 elided] 2026-03-23T18:21:49.660 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-23T18:21:49.664 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G --namespace test1 image2 2026-03-23T18:21:49.710 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd namespace remove rbd/test1 2026-03-23T18:21:49.710 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove rbd/test1 2026-03-23T18:21:49.739 INFO:tasks.workunit.client.0.vm04.stderr:rbd: namespace contains images which must be deleted first. 2026-03-23T18:21:49.743 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:49.743 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group create rbd/test1/group1 2026-03-23T18:21:49.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image add rbd/test1/group1 rbd/test1/image1 2026-03-23T18:21:49.815 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image add --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image2 2026-03-23T18:21:49.870 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image rm --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image1 2026-03-23T18:21:49.919 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image rm rbd/test1/group1 rbd/test1/image2 2026-03-23T18:21:49.961 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group rm rbd/test1/group1 2026-03-23T18:21:49.995 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash move rbd/test1/image1 2026-03-23T18:21:50.053 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash
--namespace test1 ls 2026-03-23T18:21:50.053 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1 2026-03-23T18:21:50.081 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=2954f9305965 2026-03-23T18:21:50.081 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm rbd/test1/2954f9305965 2026-03-23T18:21:50.454 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... [progress to 99% elided; two "injecting socket failure" messenger events at 18:21:50.450 and 18:21:50.538 elided] ... Removing image: 100% complete...done. 2026-03-23T18:21:51.304 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove rbd/test1/image2 2026-03-23T18:21:51.387 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... [progress to 99% elided; one "injecting socket failure" messenger event at 18:21:51.386 elided] 2026-03-23T18:21:51.402 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:21:51.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove --pool rbd --namespace test1 2026-03-23T18:21:51.451 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove --namespace test3 2026-03-23T18:21:51.500 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace list 2026-03-23T18:21:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test 2026-03-23T18:21:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:21:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:21:51.526 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:21:51.526 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove rbd/test2 2026-03-23T18:21:51.570 INFO:tasks.workunit.client.0.vm04.stderr:+ test_trash_purge_schedule 2026-03-23T18:21:51.570 INFO:tasks.workunit.client.0.vm04.stdout:testing trash purge schedule... 2026-03-23T18:21:51.570 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing trash purge schedule...' 2026-03-23T18:21:51.570 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:21:51.570 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:51.856 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:51.937 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.012 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.114 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.194 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.273 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.348 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.426 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.504 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in 
$IMGS 2026-03-23T18:21:52.650 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.731 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.813 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.893 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:52.966 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:53.027 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:53.085 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:21:53.155 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-23T18:21:54.045 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-23T18:21:54.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-23T18:21:58.003 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns1 2026-03-23T18:21:58.032 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd trash purge schedule list 2026-03-23T18:21:58.327 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{}' = '{}' 2026-03-23T18:21:58.327 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd trash purge schedule status 2026-03-23T18:21:58.327 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep '"scheduled": []' 2026-03-23T18:21:58.608 INFO:tasks.workunit.client.0.vm04.stdout: "scheduled": [] 2026-03-23T18:21:58.608 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-23T18:21:58.608 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:21:58.828 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:21:58.826+0000 7f93ae316640 0 --2- 192.168.123.104:0/722192213 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x56134b9b4530 0x56134ba56860 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request 
returned -2 2026-03-23T18:21:58.834 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:58.834 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R --format json 2026-03-23T18:21:58.861 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-23T18:21:58.861 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove dummy 2026-03-23T18:21:58.861 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove dummy 2026-03-23T18:21:58.889 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:21:58.886+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:21:58.890 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:21:58.890 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:58.890 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy 2026-03-23T18:21:58.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove 1d dummy 2026-03-23T18:21:58.913 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:21:58.910+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:21:58.913 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:21:58.916 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:58.916 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy 2026-03-23T18:21:58.916 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd dummy 2026-03-23T18:21:58.939 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:21:58.938+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument 
Invalid interval (dummy) 2026-03-23T18:21:58.939 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:21:58.942 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:58.942 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy 2026-03-23T18:21:58.942 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy 2026-03-23T18:21:58.964 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:21:58.962+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:21:58.964 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:21:58.967 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:58.967 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd 1d 01:30 2026-03-23T18:21:59.068 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd 2026-03-23T18:21:59.106 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30' 2026-03-23T18:21:59.172 INFO:tasks.workunit.client.0.vm04.stdout:every 1d starting at 01:30:00 2026-03-23T18:21:59.172 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-23T18:21:59.172 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:21:59.195 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:59.195 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R 2026-03-23T18:21:59.195 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30' 2026-03-23T18:21:59.218 INFO:tasks.workunit.client.0.vm04.stdout:rbd - every 1d starting at 01:30:00 2026-03-23T18:21:59.218 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R -p rbd 2026-03-23T18:21:59.219 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30' 2026-03-23T18:21:59.243 INFO:tasks.workunit.client.0.vm04.stdout:rbd - every 1d starting at 01:30:00 2026-03-23T18:21:59.243 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2 2026-03-23T18:21:59.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2 2026-03-23T18:21:59.267 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:21:59.267 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json 2026-03-23T18:21:59.293 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-23T18:21:59.296 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd2/ns1 2d 2026-03-23T18:21:59.353 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json 2026-03-23T18:21:59.382 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[{"pool":"rbd2","namespace":"ns1","items":[{"interval":"2d","start_time":""}]}]' '!=' '[]' 2026-03-23T18:21:59.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2 -R 2026-03-23T18:21:59.382 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *every 2d' 2026-03-23T18:21:59.409 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 2d 2026-03-23T18:21:59.409 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm -p rbd2/ns1 2026-03-23T18:21:59.444 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json 2026-03-23T18:21:59.470 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-23T18:21:59.470 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:21:59.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:21:59.471 
INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:21:59.471 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:21:59.497 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd 2026-03-23T18:21:59.497 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:22:09.499 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:22:09.499 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:22:09.499 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:09.523 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd 2026-03-23T18:22:09.523 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:22:19.525 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:22:19.525 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:22:19.525 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:19.549 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd 2026-03-23T18:22:19.549 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:22:29.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:22:29.551 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:22:29.551 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:29.576 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd 2026-03-23T18:22:29.576 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:22:39.577 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:22:39.578 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:22:39.578 
INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:39.602 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd 2026-03-23T18:22:39.602 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:22:49.604 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:22:49.604 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:22:49.604 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:49.631 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd 2026-03-23T18:22:49.631 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:22:49.631 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status 2026-03-23T18:22:49.654 INFO:tasks.workunit.client.0.vm04.stdout:POOL NAMESPACE SCHEDULE TIME 2026-03-23T18:22:49.655 INFO:tasks.workunit.client.0.vm04.stdout:rbd 2026-03-24 01:30:00 2026-03-23T18:22:49.659 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:22:49.659 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:49.688 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd 2026-03-23T18:22:49.689 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status -p rbd --format xml 2026-03-23T18:22:49.689 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:49.720 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd 2026-03-23T18:22:49.720 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 2d 00:17 2026-03-23T18:22:49.754 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:22:49.754 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17' 2026-03-23T18:22:49.777 INFO:tasks.workunit.client.0.vm04.stdout:every 2d starting at 
00:17:00 2026-03-23T18:22:49.777 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R 2026-03-23T18:22:49.777 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17' 2026-03-23T18:22:49.799 INFO:tasks.workunit.client.0.vm04.stdout:- - every 2d starting at 00:17:00 2026-03-23T18:22:49.800 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2 2026-03-23T18:22:49.800 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2 2026-03-23T18:22:49.822 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:22:49.822 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2 -R 2026-03-23T18:22:49.822 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17' 2026-03-23T18:22:49.846 INFO:tasks.workunit.client.0.vm04.stdout:- - every 2d starting at 00:17:00 2026-03-23T18:22:49.846 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2/ns1 -R 2026-03-23T18:22:49.846 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17' 2026-03-23T18:22:49.869 INFO:tasks.workunit.client.0.vm04.stdout:- - every 2d starting at 00:17:00 2026-03-23T18:22:49.870 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml 2026-03-23T18:22:49.870 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //schedules/schedule/pool 2026-03-23T18:22:49.895 INFO:tasks.workunit.client.0.vm04.stderr:+ test - = - 2026-03-23T18:22:49.895 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml 2026-03-23T18:22:49.895 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //schedules/schedule/namespace 2026-03-23T18:22:50.114 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:22:50.110+0000 7efc7bfff640 0 --2- 192.168.123.104:0/353213422 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x56083d73e350 
0x56083d7fc300 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-23T18:22:50.122 INFO:tasks.workunit.client.0.vm04.stderr:+ test - = - 2026-03-23T18:22:50.122 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml 2026-03-23T18:22:50.122 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //schedules/schedule/items/item/start_time 2026-03-23T18:22:50.149 INFO:tasks.workunit.client.0.vm04.stderr:+ test 00:17:00 = 00:17:00 2026-03-23T18:22:50.149 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:22:50.150 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:22:50.150 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:22:50.150 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:22:50.150 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:22:50.174 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:23:00.176 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:23:00.176 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:00.176 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:00.176 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:00.200 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:23:10.201 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:23:10.201 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:10.201 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:10.201 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:10.225 
INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:23:20.226 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:23:20.227 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:20.227 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:20.227 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:20.253 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:23:30.255 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:23:30.255 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:30.255 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:30.255 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:30.282 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:23:40.285 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:23:40.285 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:40.285 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:40.285 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:40.319 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:23:50.321 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:23:50.321 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:50.321 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:50.321 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:50.345 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 2026-03-23T18:23:50.345 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 2026-03-23T18:23:50.345 
INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:23:50.345 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status 2026-03-23T18:23:50.366 INFO:tasks.workunit.client.0.vm04.stdout:POOL NAMESPACE SCHEDULE TIME 2026-03-23T18:23:50.366 INFO:tasks.workunit.client.0.vm04.stdout:datapool 2026-03-24 00:17:00 2026-03-23T18:23:50.366 INFO:tasks.workunit.client.0.vm04.stdout:rbd 2026-03-24 01:30:00 2026-03-23T18:23:50.367 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 2026-03-24 00:17:00 2026-03-23T18:23:50.367 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-24 00:17:00 2026-03-23T18:23:50.370 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml 2026-03-23T18:23:50.370 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:50.370 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2 2026-03-23T18:23:50.393 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 2026-03-23T18:23:50.393 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 2026-03-23T18:23:50.393 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd rbd2 rbd2' 2026-03-23T18:23:50.393 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml 2026-03-23T18:23:50.393 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:50.418 INFO:tasks.workunit.client.0.vm04.stderr:+ echo datapool rbd rbd2 rbd2 2026-03-23T18:23:50.418 INFO:tasks.workunit.client.0.vm04.stdout:datapool rbd rbd2 rbd2 2026-03-23T18:23:50.418 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status -p rbd --format xml 2026-03-23T18:23:50.418 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:50.445 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd 2026-03-23T18:23:50.445 INFO:tasks.workunit.client.0.vm04.stderr:+++ rbd trash purge schedule status -p rbd2 --format xml 
2026-03-23T18:23:50.446 INFO:tasks.workunit.client.0.vm04.stderr:+++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-23T18:23:50.472 INFO:tasks.workunit.client.0.vm04.stderr:++ echo rbd2 rbd2 2026-03-23T18:23:50.472 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'rbd2 rbd2' = 'rbd2 rbd2' 2026-03-23T18:23:50.473 INFO:tasks.workunit.client.0.vm04.stderr:+++ rbd trash purge schedule ls -R --format xml 2026-03-23T18:23:50.473 INFO:tasks.workunit.client.0.vm04.stderr:+++ xmlstarlet sel -t -v //schedules/schedule/items 2026-03-23T18:23:50.499 INFO:tasks.workunit.client.0.vm04.stderr:++ echo 2d00:17:00 1d01:30:00 2026-03-23T18:23:50.499 INFO:tasks.workunit.client.0.vm04.stderr:+ test '2d00:17:00 1d01:30:00' = '2d00:17:00 1d01:30:00' 2026-03-23T18:23:50.499 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 1d 2026-03-23T18:23:50.531 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:23:50.531 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17' 2026-03-23T18:23:50.556 INFO:tasks.workunit.client.0.vm04.stdout:every 1d, every 2d starting at 00:17:00 2026-03-23T18:23:50.556 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:23:50.556 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d' 2026-03-23T18:23:50.587 INFO:tasks.workunit.client.0.vm04.stdout:every 1d, every 2d starting at 00:17:00 2026-03-23T18:23:50.588 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R --format xml 2026-03-23T18:23:50.588 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //schedules/schedule/items 2026-03-23T18:23:50.588 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2d00:17 2026-03-23T18:23:50.615 INFO:tasks.workunit.client.0.vm04.stdout:1d2d00:17:00 2026-03-23T18:23:50.615 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm 1d 2026-03-23T18:23:50.644 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 
2026-03-23T18:23:50.644 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17' 2026-03-23T18:23:50.666 INFO:tasks.workunit.client.0.vm04.stdout:every 2d starting at 00:17:00 2026-03-23T18:23:50.666 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm 2d 00:17 2026-03-23T18:23:50.694 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-23T18:23:50.694 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:23:50.716 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:23:50.716 INFO:tasks.workunit.client.0.vm04.stderr:+ for p in rbd2 rbd2/ns1 2026-03-23T18:23:50.716 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1 2026-03-23T18:23:50.745 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv rbd2/ns1/test1 2026-03-23T18:23:50.782 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:23:50.782 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:23:50.782 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:23:50.807 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:23:50.807 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd2 1m 2026-03-23T18:23:50.839 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-23T18:23:50.839 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:23:50.864 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m 2026-03-23T18:23:50.864 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-23T18:23:50.864 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:23:50.892 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m 2026-03-23T18:23:50.892 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:23:50.893 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 
2026-03-23T18:23:50.893 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:23:50.893 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:23:50.893 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:23:50.917 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:23:50.917 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:24:00.919 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:00.919 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:00.919 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:00.919 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:00.945 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:24:00.945 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:00.946 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:00.946 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-23T18:24:00.968 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:24:00.968 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-23T18:24:00.968 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:24:00.992 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m 2026-03-23T18:24:00.997 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-23T18:24:00.997 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:24:01.051 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m 2026-03-23T18:24:01.051 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status 2026-03-23T18:24:01.051 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1' 2026-03-23T18:24:01.079 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-23 18:25:00 2026-03-23T18:24:01.079 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 
schedule status -p rbd2 2026-03-23T18:24:01.079 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1' 2026-03-23T18:24:01.106 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-23 18:25:00 2026-03-23T18:24:01.106 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2/ns1 2026-03-23T18:24:01.106 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1' 2026-03-23T18:24:01.133 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-23 18:25:00 2026-03-23T18:24:01.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm -p rbd2 1m 2026-03-23T18:24:01.164 INFO:tasks.workunit.client.0.vm04.stderr:+ for p in rbd2 rbd2/ns1 2026-03-23T18:24:01.164 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1 2026-03-23T18:24:01.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv rbd2/ns1/test1 2026-03-23T18:24:01.245 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:01.245 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:01.245 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:01.269 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:01.269 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd2/ns1 1m 2026-03-23T18:24:01.307 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:24:01.307 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-23T18:24:01.334 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m 2026-03-23T18:24:01.335 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-23T18:24:01.335 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:24:01.364 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m 2026-03-23T18:24:01.364 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:24:01.365 
INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:01.365 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:01.365 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:01.365 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:01.391 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:01.391 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:24:11.392 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:11.393 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:11.393 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:11.393 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:11.417 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:11.417 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:24:21.418 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:21.418 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:21.418 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:21.418 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:21.442 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:21.442 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:24:31.443 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:31.443 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:31.443 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:31.443 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:31.467 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:31.468 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:24:41.469 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:41.469 INFO:tasks.workunit.client.0.vm04.stderr:+ 
rbd trash ls rbd2/ns1 2026-03-23T18:24:41.469 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:41.469 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:41.499 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:41.500 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:24:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:24:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:24:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:24:51.501 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:24:51.526 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-23T18:24:51.526 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:25:01.527 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:25:01.528 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:25:01.528 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:25:01.528 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-23T18:25:01.553 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:25:01.553 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1 2026-03-23T18:25:01.553 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-23T18:25:01.553 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-23T18:25:01.577 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-23T18:25:01.578 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-23T18:25:01.578 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:25:01.603 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m 2026-03-23T18:25:01.604 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-23T18:25:01.604 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-23T18:25:01.628 
INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m 2026-03-23T18:25:01.629 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status 2026-03-23T18:25:01.629 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1' 2026-03-23T18:25:01.651 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-23 18:26:00 2026-03-23T18:25:01.652 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2 2026-03-23T18:25:01.652 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1' 2026-03-23T18:25:01.676 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-23 18:26:00 2026-03-23T18:25:01.676 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2/ns1 2026-03-23T18:25:01.676 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1' 2026-03-23T18:25:01.700 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-23 18:26:00 2026-03-23T18:25:01.700 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm -p rbd2/ns1 1m 2026-03-23T18:25:01.725 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 2m 2026-03-23T18:25:01.752 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add -p rbd dummy 2026-03-23T18:25:01.752 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd dummy 2026-03-23T18:25:01.773 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.770+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:25:01.774 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:25:01.776 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.776 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add -p rbd 1d dummy 2026-03-23T18:25:01.776 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd 1d dummy 
2026-03-23T18:25:01.797 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.794+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.797 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.799 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.799 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add dummy 2026-03-23T18:25:01.799 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add dummy 2026-03-23T18:25:01.819 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.818+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:25:01.820 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:25:01.822 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.822 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add 1d dummy 2026-03-23T18:25:01.822 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 1d dummy 2026-03-23T18:25:01.842 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.838+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.842 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.845 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.845 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy 2026-03-23T18:25:01.845 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 
dummy 2026-03-23T18:25:01.866 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.862+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:25:01.866 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:25:01.868 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.868 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy 2026-03-23T18:25:01.868 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy 2026-03-23T18:25:01.889 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.886+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.890 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.892 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.892 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove dummy 2026-03-23T18:25:01.892 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove dummy 2026-03-23T18:25:01.911 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.910+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:25:01.911 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:25:01.913 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.913 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy 2026-03-23T18:25:01.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove 1d dummy 
2026-03-23T18:25:01.934 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:01.930+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.934 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:01.936 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:01.936 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd 2026-03-23T18:25:01.936 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30' 2026-03-23T18:25:01.959 INFO:tasks.workunit.client.0.vm04.stdout:every 1d starting at 01:30:00 2026-03-23T18:25:01.960 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-23T18:25:01.960 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2m' 2026-03-23T18:25:01.982 INFO:tasks.workunit.client.0.vm04.stdout:every 2m 2026-03-23T18:25:01.982 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 1d 01:30 2026-03-23T18:25:02.011 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove 2m 2026-03-23T18:25:02.043 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R --format json 2026-03-23T18:25:02.068 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-23T18:25:02.068 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:25:02.068 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.139 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.213 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.282 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.348 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.426 INFO:tasks.workunit.client.0.vm04.stderr:+ 
for img in $IMGS 2026-03-23T18:25:02.516 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.598 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.689 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:02.974 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.045 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.320 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.394 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.461 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.523 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.585 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.647 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.710 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:03.771 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-23T18:25:04.233 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist 2026-03-23T18:25:04.250 INFO:tasks.workunit.client.0.vm04.stdout:testing recovery of trash_purge_schedule handler after module's RADOS client is blocklisted... 2026-03-23T18:25:04.251 INFO:tasks.workunit.client.0.vm04.stderr:+ test_trash_purge_schedule_recovery 2026-03-23T18:25:04.251 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing recovery of trash_purge_schedule handler after module'\''s RADOS client is blocklisted...' 
2026-03-23T18:25:04.251 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:25:04.251 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.326 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.402 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.483 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.563 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.632 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.700 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.761 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.891 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:04.957 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.023 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.095 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.176 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.248 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.317 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.387 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.469 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:05.540 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd3 8 2026-03-23T18:25:06.241 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' already exists 2026-03-23T18:25:06.254 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd3 2026-03-23T18:25:09.208 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd3/ns1 2026-03-23T18:25:09.237 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3/ns1 2d 2026-03-23T18:25:09.266 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-23T18:25:09.266 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd3 *ns1 *every 2d' 2026-03-23T18:25:09.291 INFO:tasks.workunit.client.0.vm04.stdout:rbd3 ns1 every 2d 2026-03-23T18:25:09.292 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump 2026-03-23T18:25:09.292 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-23T18:25:09.292 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]' 2026-03-23T18:25:09.292 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-23T18:25:09.560 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/2400433533 2026-03-23T18:25:09.560 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/2400433533 2026-03-23T18:25:11.211 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/2400433533 until 2026-03-23T19:25:10.263558+0000 (3600 sec) 2026-03-23T18:25:11.226 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add -p rbd3 10m 2026-03-23T18:25:11.226 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-23T18:25:11.454 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:11.450+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule) 2026-03-23T18:25:11.454 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule) 2026-03-23T18:25:11.458 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 
2026-03-23T18:25:11.458 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:25:21.459 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24 2026-03-23T18:25:21.460 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-23T18:25:21.460 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-23T18:25:21.484 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:21.482+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-23T18:25:21.485 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-23T18:25:21.491 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:25:31.492 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-23T18:25:31.492 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-23T18:25:31.513 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:31.510+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-23T18:25:31.513 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-23T18:25:31.515 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:25:41.517 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-23T18:25:41.517 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-23T18:25:41.540 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:41.538+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-23T18:25:41.540 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) 
Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-23T18:25:41.543 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:25:51.544 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-23T18:25:51.544 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-23T18:25:51.590 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:25:51.591 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-23T18:25:51.591 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 10m' 2026-03-23T18:25:51.618 INFO:tasks.workunit.client.0.vm04.stdout:rbd3 - every 10m 2026-03-23T18:25:51.618 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-23T18:25:51.618 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd3 *ns1 *every 2d' 2026-03-23T18:25:51.645 INFO:tasks.workunit.client.0.vm04.stdout:rbd3 ns1 every 2d 2026-03-23T18:25:51.646 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd3 10m 2026-03-23T18:25:51.675 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd3/ns1 2d 2026-03-23T18:25:51.704 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-23T18:25:51.704 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'every 10m' 2026-03-23T18:25:51.704 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 10m' 2026-03-23T18:25:51.729 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:51.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-23T18:25:51.729 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'rbd3 *ns1 *every 2d' 2026-03-23T18:25:51.729 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd3 *ns1 *every 2d' 2026-03-23T18:25:51.757 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:51.757 
INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it 2026-03-23T18:25:52.230 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' does not exist 2026-03-23T18:25:52.243 INFO:tasks.workunit.client.0.vm04.stdout:testing mirror snapshot schedule... 2026-03-23T18:25:52.243 INFO:tasks.workunit.client.0.vm04.stderr:+ test_mirror_snapshot_schedule 2026-03-23T18:25:52.243 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing mirror snapshot schedule...' 2026-03-23T18:25:52.243 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:25:52.243 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.324 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.405 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.477 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.558 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.633 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.702 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.772 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.852 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.920 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:52.988 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.057 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.198 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.269 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.337 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.608 INFO:tasks.workunit.client.0.vm04.stderr:+ for img 
in $IMGS 2026-03-23T18:25:53.675 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:25:53.744 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-23T18:25:54.232 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-23T18:25:54.244 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-23T18:25:57.207 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns1 2026-03-23T18:25:57.239 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd2 image 2026-03-23T18:25:57.267 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd2/ns1 image 2026-03-23T18:25:57.299 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool peer add rbd2 cluster1 2026-03-23T18:25:57.324 INFO:tasks.workunit.client.0.vm04.stdout:08b14619-760b-4956-9383-7a1f8f36e10f 2026-03-23T18:25:57.327 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd mirror snapshot schedule list 2026-03-23T18:25:57.613 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{}' = '{}' 2026-03-23T18:25:57.613 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd mirror snapshot schedule status 2026-03-23T18:25:57.613 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep '"scheduled_images": []' 2026-03-23T18:25:57.875 INFO:tasks.workunit.client.0.vm04.stdout: "scheduled_images": [] 2026-03-23T18:25:57.875 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-23T18:25:57.875 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls 2026-03-23T18:25:57.898 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:57.898 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -R --format json 2026-03-23T18:25:57.921 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-23T18:25:57.921 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1 2026-03-23T18:25:57.951 
INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:25:57.952 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:25:57.976 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mirroring not enabled on the image 2026-03-23T18:25:57.980 INFO:tasks.workunit.client.0.vm04.stderr:+ test 0 = 0 2026-03-23T18:25:57.980 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image enable rbd2/ns1/test1 snapshot 2026-03-23T18:25:58.214 INFO:tasks.workunit.client.0.vm04.stdout:Mirroring enabled 2026-03-23T18:25:58.221 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:25:58.221 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:25:58.251 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1 2026-03-23T18:25:58.251 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy 2026-03-23T18:25:58.251 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove dummy 2026-03-23T18:25:58.272 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:58.270+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:25:58.272 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:25:58.274 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.274 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy 2026-03-23T18:25:58.274 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove 1h dummy 2026-03-23T18:25:58.295 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:58.294+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:58.295 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror 
snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:58.298 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.298 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-23T18:25:58.298 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-23T18:25:58.326 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:58.322+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:25:58.326 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:25:58.329 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.329 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-23T18:25:58.329 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-23T18:25:58.356 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:25:58.354+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:58.357 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:25:58.359 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.359 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1m 2026-03-23T18:25:58.390 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-23T18:25:58.390 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls 
2026-03-23T18:25:58.413 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.413 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:25:58.413 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-23T18:25:58.435 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:25:58.435 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2 2026-03-23T18:25:58.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-23T18:25:58.458 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.459 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-23T18:25:58.459 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:25:58.482 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:25:58.482 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-23T18:25:58.482 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-23T18:25:58.505 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:25:58.505 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-23T18:25:58.505 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:25:58.529 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:25:58.529 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-23T18:25:58.559 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-23T18:25:58.559 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:25:58.559 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:25:58.559 
INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:25:58.560 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:25:58.589 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-23T18:25:58.590 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:26:08.590 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:26:08.591 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:26:08.591 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:26:08.623 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-23T18:26:08.623 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:26:18.624 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:26:18.624 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:26:18.624 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:26:18.658 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-23T18:26:18.658 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:26:28.659 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:26:28.660 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:26:28.660 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:26:28.691 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-23T18:26:28.691 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:26:38.692 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:26:38.692 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:26:38.692 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:26:38.724 INFO:tasks.workunit.client.0.vm04.stderr:+ 
test 1 -gt 1 2026-03-23T18:26:38.725 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:26:48.726 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:26:48.726 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:26:48.726 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:26:48.763 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-23T18:26:48.764 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:26:58.765 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:26:58.765 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:26:58.765 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:26:58.799 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-23T18:26:58.799 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:27:08.800 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:08.800 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:27:08.800 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:27:08.830 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 -gt 1 2026-03-23T18:27:08.831 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:27:08.831 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-23T18:27:08.831 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:27:08.860 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 -gt 1 2026-03-23T18:27:08.860 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-23T18:27:08.860 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls 2026-03-23T18:27:08.883 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:27:08.883 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-23T18:27:08.883 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:27:08.910 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:27:08.910 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2 2026-03-23T18:27:08.910 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-23T18:27:08.934 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:27:08.934 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-23T18:27:08.934 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:27:08.959 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:27:08.960 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-23T18:27:08.960 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-23T18:27:08.984 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:27:08.984 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-23T18:27:08.984 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:27:09.010 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:27:09.010 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-23T18:27:09.040 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-23T18:27:09.040 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:09.061 INFO:tasks.workunit.client.0.vm04.stdout:SCHEDULE TIME IMAGE 2026-03-23T18:27:09.061 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:28:00 rbd2/ns1/test1 
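The `+ for i in \`seq 12\`` / `+ test 1 -gt 1` / `+ sleep 10` cycle traced above is a polling helper: the workunit repeatedly counts `mirror.primary` snapshots for the image and proceeds once more than one exists. A minimal self-contained sketch of that loop — with a hypothetical `rbd` stub standing in for the real CLI, since the actual test talks to a live cluster:

```shell
# Hypothetical stub for the real `rbd` CLI so this sketch runs standalone;
# the real command queries the cluster for the image's mirror status.
rbd() {
  printf 'mirror.primary.snap1\nmirror.primary.snap2\n'
}

# Poll up to 12 times (~2 minutes in the real test) until more than one
# mirror.primary snapshot is reported, matching the trace above.
wait_for_primary_snaps() {
  local image=$1 count
  for i in `seq 12`; do
    count=$(rbd mirror image status "$image" | grep -c mirror.primary)
    test "$count" -gt 1 && return 0
    sleep 10
  done
  return 1
}

wait_for_primary_snaps rbd2/ns1/test1 && echo ready
```

With the stub, `grep -c` reports 2 on the first pass, so the loop exits immediately; against a real cluster the `sleep 10` between attempts gives the mgr's scheduler time to create the snapshots.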
2026-03-23T18:27:09.063 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status --format xml 2026-03-23T18:27:09.063 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-23T18:27:09.087 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-23T18:27:09.088 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status -p rbd2 --format xml 2026-03-23T18:27:09.088 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-23T18:27:09.112 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-23T18:27:09.112 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --format xml 2026-03-23T18:27:09.112 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-23T18:27:09.138 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-23T18:27:09.138 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --image test1 --format xml 2026-03-23T18:27:09.138 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-23T18:27:09.170 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-23T18:27:09.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image demote rbd2/ns1/test1 2026-03-23T18:27:09.351 INFO:tasks.workunit.client.0.vm04.stdout:Image demoted to non-primary 2026-03-23T18:27:09.356 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:27:09.357 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:09.357 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:09.357 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:09.380 
INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:28:00 rbd2/ns1/test1 2026-03-23T18:27:09.380 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:27:19.382 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:19.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:19.382 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:19.404 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:28:00 rbd2/ns1/test1 2026-03-23T18:27:19.404 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:27:29.405 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:29.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:29.405 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:29.428 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:28:00 rbd2/ns1/test1 2026-03-23T18:27:29.428 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:27:39.429 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:39.429 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:39.429 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:39.452 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:28:00 rbd2/ns1/test1 2026-03-23T18:27:39.452 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:27:49.453 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:49.453 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:49.453 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:49.476 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:27:49.476 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:49.476 
INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep rbd2/ns1/test1 2026-03-23T18:27:49.476 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:49.498 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:27:49.498 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image promote rbd2/ns1/test1 2026-03-23T18:27:50.095 INFO:tasks.workunit.client.0.vm04.stdout:Image promoted to primary 2026-03-23T18:27:50.101 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:27:50.102 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:27:50.102 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:27:50.102 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:27:50.125 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:28:00.127 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:28:00.127 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:00.127 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:00.151 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:28:10.152 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:28:10.153 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:10.153 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:10.176 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:28:20.178 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:28:20.178 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:20.178 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:20.201 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:28:30.203 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 
2026-03-23T18:28:30.203 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:30.203 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:30.226 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:28:40.227 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:28:40.227 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:40.227 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:40.250 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:28:50.251 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:28:50.251 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:50.251 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:50.274 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:29:00 rbd2/ns1/test1 2026-03-23T18:28:50.274 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:28:50.274 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:50.274 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:50.296 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:29:00 rbd2/ns1/test1 2026-03-23T18:28:50.296 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add 1h 00:15 2026-03-23T18:28:50.325 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls 2026-03-23T18:28:50.347 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00' 2026-03-23T18:28:50.347 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-23T18:28:50.347 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1h starting at 00:15:00' 2026-03-23T18:28:50.369 INFO:tasks.workunit.client.0.vm04.stdout:- - - every 1h starting at 
00:15:00 2026-03-23T18:28:50.369 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-23T18:28:50.369 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:28:50.392 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:28:50.392 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2 2026-03-23T18:28:50.392 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-23T18:28:50.415 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.415 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-23T18:28:50.415 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1h starting at 00:15:00' 2026-03-23T18:28:50.439 INFO:tasks.workunit.client.0.vm04.stdout:- - - every 1h starting at 00:15:00 2026-03-23T18:28:50.439 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-23T18:28:50.439 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:28:50.462 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:28:50.462 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-23T18:28:50.462 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-23T18:28:50.486 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.486 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-23T18:28:50.486 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1h starting at 00:15:00' 2026-03-23T18:28:50.510 INFO:tasks.workunit.client.0.vm04.stdout:- - - every 1h starting at 00:15:00 2026-03-23T18:28:50.511 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-23T18:28:50.511 
INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-23T18:28:50.535 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-23T18:28:50.535 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-23T18:28:50.564 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-23T18:28:50.564 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add dummy 2026-03-23T18:28:50.564 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add dummy 2026-03-23T18:28:50.585 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.582+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:28:50.585 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:28:50.587 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.587 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add 1h dummy 2026-03-23T18:28:50.587 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add 1h dummy 2026-03-23T18:28:50.608 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.606+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.608 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.610 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.610 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy 2026-03-23T18:28:50.610 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy 
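The recurring `+ expect_fail rbd mirror snapshot schedule ... / + return 0` pairs in the trace are negative tests: the wrapped command is expected to fail (here with EINVAL for `Invalid interval (dummy)` or an unparsable start time). The exact helper lives in the qa workunit sources; a common shape for it, offered as an assumption, is:

```shell
# Assumed shape of the workunit's expect_fail helper: succeed only when
# the wrapped command fails, e.g.
#   expect_fail rbd mirror snapshot schedule add dummy
# which must be rejected with EINVAL by the mgr.
expect_fail() {
  if "$@"; then
    echo "expected failure, but '$*' succeeded" >&2
    return 1
  fi
  return 0
}

expect_fail false && echo "negative test passed"
```

Note the inversion: `expect_fail` returning 0 (the `+ return 0` lines above) means the command failed as required, while a command that unexpectedly succeeds makes the helper itself fail.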
2026-03-23T18:28:50.637 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.634+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:28:50.638 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:28:50.640 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.640 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy 2026-03-23T18:28:50.640 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy 2026-03-23T18:28:50.668 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.666+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.668 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.671 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.671 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy 2026-03-23T18:28:50.671 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove dummy 2026-03-23T18:28:50.691 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.690+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:28:50.692 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:28:50.694 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.694 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy 2026-03-23T18:28:50.694 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd 
mirror snapshot schedule remove 1h dummy 2026-03-23T18:28:50.715 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.714+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.715 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.718 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.718 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-23T18:28:50.718 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-23T18:28:50.745 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.742+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-23T18:28:50.745 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-23T18:28:50.748 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.748 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-23T18:28:50.748 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-23T18:28:50.777 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:28:50.774+0000 7f9fbb311640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.777 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-23T18:28:50.779 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:28:50.780 
INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls 2026-03-23T18:28:50.803 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00' 2026-03-23T18:28:50.803 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-23T18:28:50.833 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-23T18:28:50.833 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd2/ns1/test1 2026-03-23T18:28:53.340 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:28:53.338+0000 7f6f9ecb0640 0 -- 192.168.123.104:0/1532749487 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x55b6399c0b20 msgr2=0x55b6399b19c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-23T18:28:54.339 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:28:54.338+0000 7f6f9ecb0640 0 -- 192.168.123.104:0/1532749487 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f6f8005dd90 msgr2=0x7f6f8007e170 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-23T18:28:54.361 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
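The `_try_send injecting socket failure` / `read_until injecting socket failure` lines around `rbd rm` are expected noise, not errors: this job was scheduled with the `msgr-failures/few` fragment, which (per the job YAML in the header) configures the messenger to randomly drop connections so the client's reconnect paths get exercised. The relevant overrides are roughly:

```
# From this job's ceph conf overrides (see the job YAML above)
[global]
ms inject socket failures = 5000
mon client directed command retry = 5
```

Each injected failure forces the client to re-establish the session and retry, which is why the `Removing image: 100% complete...done.` line still follows despite the two dropped connections.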
2026-03-23T18:28:54.365 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-23T18:28:54.366 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:28:54.366 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:28:54.366 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:28:54.389 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:29:00 rbd2/ns1/test1 2026-03-23T18:28:54.389 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:29:04.390 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:29:04.390 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:04.391 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:04.414 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:30:00 rbd2/ns1/test1 2026-03-23T18:29:04.414 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:29:14.415 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:29:14.415 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:14.415 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:14.439 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:30:00 rbd2/ns1/test1 2026-03-23T18:29:14.439 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:29:24.440 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:29:24.440 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:24.440 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:24.463 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:30:00 rbd2/ns1/test1 2026-03-23T18:29:24.463 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:29:34.465 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:29:34.465 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:34.465 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:34.488 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:30:00 rbd2/ns1/test1 2026-03-23T18:29:34.488 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:29:44.489 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:29:44.489 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:44.489 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:44.512 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-23 18:30:00 rbd2/ns1/test1 2026-03-23T18:29:44.512 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:29:54.513 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-23T18:29:54.513 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:54.513 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:54.536 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-23T18:29:54.536 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-23T18:29:54.536 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep rbd2/ns1/test1 2026-03-23T18:29:54.536 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-23T18:29:54.558 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:29:54.558 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove 2026-03-23T18:29:54.585 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -R --format json 2026-03-23T18:29:54.608 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-23T18:29:54.609 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:29:54.609 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:54.672 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:54.735 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:54.798 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:54.863 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:54.930 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:54.996 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.062 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.192 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.256 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.322 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.388 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.453 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.515 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.576 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.640 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.704 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:55.767 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-23T18:29:56.638 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist 2026-03-23T18:29:56.650 INFO:tasks.workunit.client.0.vm04.stdout:testing recovery of mirror snapshot scheduler after module's RADOS client is blocklisted... 
2026-03-23T18:29:56.651 INFO:tasks.workunit.client.0.vm04.stderr:+ test_mirror_snapshot_schedule_recovery 2026-03-23T18:29:56.651 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing recovery of mirror snapshot scheduler after module'\''s RADOS client is blocklisted...' 2026-03-23T18:29:56.651 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:29:56.651 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:56.720 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:56.783 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:56.847 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:56.909 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:56.975 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.040 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.103 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.170 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.236 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.300 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.566 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.635 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.702 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.766 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.828 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.890 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:57.954 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:29:58.018 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd3 8 2026-03-23T18:29:58.644 
INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' already exists 2026-03-23T18:29:58.656 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd3 2026-03-23T18:29:58.873 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:29:58.875+0000 7f59bf7fe640 0 --2- 192.168.123.104:0/3763743445 >> v2:192.168.123.104:3300/0 conn(0x562283466e90 0x562283467990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-23T18:30:01.617 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd3/ns1 2026-03-23T18:30:01.645 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd3 image 2026-03-23T18:30:01.672 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd3/ns1 image 2026-03-23T18:30:01.697 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool peer add rbd3 cluster1 2026-03-23T18:30:01.719 INFO:tasks.workunit.client.0.vm04.stdout:a558befe-651a-42c7-aba3-875aca908091 2026-03-23T18:30:01.721 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd3/ns1/test1 2026-03-23T18:30:01.749 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image enable rbd3/ns1/test1 snapshot 2026-03-23T18:30:02.623 INFO:tasks.workunit.client.0.vm04.stdout:Mirroring enabled 2026-03-23T18:30:02.629 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd3/ns1/test1 2026-03-23T18:30:02.629 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-23T18:30:02.659 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1 2026-03-23T18:30:02.659 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 1m 2026-03-23T18:30:02.691 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-23T18:30:02.720 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-23T18:30:02.720 
INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump 2026-03-23T18:30:02.720 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-23T18:30:02.720 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]' 2026-03-23T18:30:02.720 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-23T18:30:02.978 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/297303125 2026-03-23T18:30:02.978 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/297303125 2026-03-23T18:30:04.619 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/297303125 until 2026-03-23T19:30:03.672500+0000 (3600 sec) 2026-03-23T18:30:04.634 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-23T18:30:04.634 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-23T18:30:04.660 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:30:04.659+0000 7f9fbb311640 -1 librbd::api::Namespace: list: error listing namespaces: (108) Cannot send after transport endpoint shutdown 2026-03-23T18:30:04.660 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:30:04.659+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RBD connection was shutdown (error listing namespaces) 2026-03-23T18:30:04.661 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: [errno 108] RBD connection was shutdown (error listing namespaces) 2026-03-23T18:30:04.663 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-23T18:30:04.663 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:30:14.664 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24 2026-03-23T18:30:14.665 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in 
`seq 24` 2026-03-23T18:30:14.665 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-23T18:30:14.687 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:30:14.687+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-23T18:30:14.687 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-23T18:30:14.690 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:30:24.691 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-23T18:30:24.691 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-23T18:30:24.713 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:30:24.711+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-23T18:30:24.714 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-23T18:30:24.716 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-23T18:30:34.717 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-23T18:30:34.717 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-23T18:30:34.740 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:30:34.739+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-23T18:30:34.740 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-23T18:30:34.743 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 
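This recovery phase first extracts the rbd_support module's RADOS client address from `ceph mgr dump` with `jq` and blocklists it, then polls `rbd mirror snapshot schedule add` until the module re-registers — each attempt failing with EAGAIN (`rbd_support module is not ready, try again`) until recovery completes. A self-contained sketch of that retry loop, using a hypothetical stub that fails the first few calls the way the mgr does in the trace:

```shell
# Hypothetical stand-in for `rbd mirror snapshot schedule add`: fail with an
# EAGAIN-style error for the first 3 calls, then succeed -- imitating the
# rbd_support module coming back after its RADOS client was blocklisted.
attempts=0
schedule_add() {
  attempts=$((attempts + 1))
  if [ "$attempts" -le 3 ]; then
    echo "rbd_support module is not ready, try again" >&2
    return 11
  fi
  return 0
}

# Retry loop as seen in the trace (there: `for i in \`seq 24\`` with
# `sleep 10` between attempts; the sleep is shortened for this sketch).
for i in `seq 24`; do
  schedule_add && break
  sleep 0
done
echo "recovered after $attempts attempts"
```

In the real run the module took roughly 40 seconds (four EAGAIN replies at 10-second intervals) before the fifth `schedule add` succeeded and the loop hit `break`.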
2026-03-23T18:30:44.744 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-23T18:30:44.744 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-23T18:30:44.767 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:30:44.763+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-23T18:30:44.767 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-23T18:30:44.769 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-23T18:30:54.770 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-23T18:30:54.770 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-23T18:30:54.802 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:30:54.802 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-23T18:30:54.802 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2m'
2026-03-23T18:30:54.831 INFO:tasks.workunit.client.0.vm04.stdout:every 1m, every 2m
2026-03-23T18:30:54.831 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-23T18:30:54.831 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-23T18:30:54.860 INFO:tasks.workunit.client.0.vm04.stdout:every 1m, every 2m
2026-03-23T18:30:54.860 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 2m
2026-03-23T18:30:54.891 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 1m
2026-03-23T18:30:54.922 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-23T18:30:54.922 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'every 2m'
2026-03-23T18:30:54.922 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2m'
2026-03-23T18:30:54.952 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:30:54.952 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-23T18:30:54.952 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'every 1m'
2026-03-23T18:30:54.952 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-23T18:30:54.981 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:30:54.981 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge rbd3/ns1/test1
2026-03-23T18:30:55.008 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd3/ns1/test1
2026-03-23T18:30:55.126 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:30:55.123+0000 7f3ef6b18640 0 -- 192.168.123.104:0/1791878751 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f3ed405d0c0 msgr2=0x7f3ed407d4c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:30:55.130 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:30:55.127+0000 7f3ef6b18640 0 -- 192.168.123.104:0/1791878751 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x55c2b7248b20 msgr2=0x7f3ed409ebf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:30:55.136 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-23T18:30:55.139 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it
2026-03-23T18:30:56.170 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' does not exist
2026-03-23T18:30:56.182 INFO:tasks.workunit.client.0.vm04.stdout:testing perf image iostat...
2026-03-23T18:30:56.182 INFO:tasks.workunit.client.0.vm04.stderr:+ test_perf_image_iostat
2026-03-23T18:30:56.182 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing perf image iostat...'
2026-03-23T18:30:56.182 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:30:56.182 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.249 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.312 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.376 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.642 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.707 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.773 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.900 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:56.965 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.029 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.095 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.161 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.226 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.291 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.354 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.416 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.481 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:30:57.544 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd1 8
2026-03-23T18:30:58.188 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' already exists
2026-03-23T18:30:58.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd1
2026-03-23T18:31:01.157 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd1/ns
2026-03-23T18:31:01.183 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T18:31:02.208 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T18:31:02.220 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T18:31:05.176 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns
2026-03-23T18:31:05.204 INFO:tasks.workunit.client.0.vm04.stderr:+ IMAGE_SPECS=("test1" "rbd1/test2" "rbd1/ns/test3" "rbd2/test4" "rbd2/ns/test5")
2026-03-23T18:31:05.204 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.204 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' test1
2026-03-23T18:31:05.240 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.241 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/test2
2026-03-23T18:31:05.271 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.271 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/ns/test3
2026-03-23T18:31:05.301 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.301 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/test4
2026-03-23T18:31:05.331 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.331 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/ns/test5
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS=()
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false test1
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/test2
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/ns/test3
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:31:05.362 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:31:05.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/ns/test5
2026-03-23T18:31:05.364 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd1
2026-03-23T18:31:05.364 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:05.364 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/test4
2026-03-23T18:31:05.409 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:31:05.410 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:31:15.416 INFO:tasks.workunit.client.0.vm04.stderr:+ test test2 = test2
2026-03-23T18:31:15.417 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:15.418 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd1/ns
2026-03-23T18:31:15.502 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:31:15.502 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:31:25.509 INFO:tasks.workunit.client.0.vm04.stderr:+ test test3 = test3
2026-03-23T18:31:25.510 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:25.512 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd1 /ns
2026-03-23T18:31:25.572 INFO:tasks.workunit.client.0.vm04.stderr:+ test test3 = test3
2026-03-23T18:31:25.572 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --pool rbd2
2026-03-23T18:31:25.573 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:25.643 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:31:25.643 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:31:35.649 INFO:tasks.workunit.client.0.vm04.stderr:+ test test4 = test4
2026-03-23T18:31:35.649 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:35.654 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --pool rbd2 --namespace ns
2026-03-23T18:31:35.719 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:31:35.719 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:31:45.769 INFO:tasks.workunit.client.0.vm04.stderr:+ test test5 = test5
2026-03-23T18:31:45.770 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd2 --namespace ns
2026-03-23T18:31:45.770 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:46.006 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:31:46.003+0000 7f3d53c95640 0 --2- 192.168.123.104:0/449253237 >> v2:192.168.123.104:3300/0 conn(0x55deee8cff10 0x55deee8f02f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-23T18:31:47.016 INFO:tasks.workunit.client.0.vm04.stderr:+ test test5 = test5
2026-03-23T18:31:47.016 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json
2026-03-23T18:31:47.016 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:31:47.081 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:31:47.081 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'test1 test2 test3 test4 test5' = 'test1 test2 test3 test4 test5'
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 83248
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 83249
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 83250
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 83251
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:32:02.091 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 83252
2026-03-23T18:32:02.092 INFO:tasks.workunit.client.0.vm04.stderr:+ wait
2026-03-23T18:32:02.128 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:32:02.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.203 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.272 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.344 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.412 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.485 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.555 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.625 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.694 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.762 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.832 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.903 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:02.973 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:03.043 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:05.438 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:05.718 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:05.793 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:05.865 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:05.937 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T18:32:06.281 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T18:32:06.308 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it
2026-03-23T18:32:07.263 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' does not exist
2026-03-23T18:32:07.294 INFO:tasks.workunit.client.0.vm04.stderr:+ test_perf_image_iostat_recovery
2026-03-23T18:32:07.294 INFO:tasks.workunit.client.0.vm04.stdout:testing recovery of perf handler after module's RADOS client is blocklisted...
2026-03-23T18:32:07.294 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing recovery of perf handler after module'\''s RADOS client is blocklisted...'
2026-03-23T18:32:07.294 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:32:07.294 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.544 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.617 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.689 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.908 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:07.982 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.053 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.124 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.200 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.273 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.346 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.420 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.491 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.563 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.634 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:32:08.704 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd3 8
2026-03-23T18:32:09.240 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' already exists
2026-03-23T18:32:09.252 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd3
2026-03-23T18:32:12.208 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd3/ns
2026-03-23T18:32:12.236 INFO:tasks.workunit.client.0.vm04.stderr:+ IMAGE_SPECS=("rbd3/test1" "rbd3/ns/test2")
2026-03-23T18:32:12.236 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:32:12.236 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/test1
2026-03-23T18:32:12.265 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:32:12.265 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/ns/test2
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS=()
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/test1
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/ns/test2
2026-03-23T18:32:12.300 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd3
2026-03-23T18:32:12.301 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:32:12.330 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:32:12.330 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:32:22.338 INFO:tasks.workunit.client.0.vm04.stderr:+ test test1 = test1
2026-03-23T18:32:22.339 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump
2026-03-23T18:32:22.339 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]'
2026-03-23T18:32:22.347 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-23T18:32:22.348 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-23T18:32:22.756 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/1511835233
2026-03-23T18:32:22.756 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/1511835233
2026-03-23T18:32:24.897 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/1511835233 until 2026-03-23T19:32:23.968642+0000 (3600 sec)
2026-03-23T18:32:24.916 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd perf image iostat --format json rbd3/ns
2026-03-23T18:32:24.916 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd perf image iostat --format json rbd3/ns
2026-03-23T18:32:24.975 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:32:24.975 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:32:29.978 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:32:29.975+0000 7f9fb4b04640 -1 librbd::api::Image: list_images_v2: error listing image in directory: (108) Cannot send after transport endpoint shutdown
2026-03-23T18:32:29.979 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:32:29.975+0000 7f9fb4b04640 -1 librbd::api::Image: list_images: error listing v2 images: (108) Cannot send after transport endpoint shutdown
2026-03-23T18:32:29.979 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:32:29.979+0000 7f9fbb311640 -1 mgr.server reply reply (2) No such file or directory '30cca74d0d96'
2026-03-23T18:32:29.979 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mgr command failed: (2) No such file or directory: '30cca74d0d96'
2026-03-23T18:32:29.983 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:32:29.983 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-23T18:32:39.986 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24
2026-03-23T18:32:39.987 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-23T18:32:39.988 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd3/ns
2026-03-23T18:32:39.989 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:32:40.030 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:32:40.027+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-23T18:32:40.030 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-23T18:32:40.033 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = test2
2026-03-23T18:32:40.033 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-23T18:32:50.034 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-23T18:32:50.035 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:32:50.036 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd3/ns
2026-03-23T18:32:50.097 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:32:50.095+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-23T18:32:50.098 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-23T18:32:50.105 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = test2
2026-03-23T18:32:50.105 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-23T18:33:00.107 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-23T18:33:00.110 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd3/ns
2026-03-23T18:33:00.110 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-23T18:33:00.161 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-23T18:33:00.161 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-23T18:33:10.166 INFO:tasks.workunit.client.0.vm04.stderr:+ test test2 = test2
2026-03-23T18:33:10.166 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:33:10.166 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:33:10.167 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 85182
2026-03-23T18:33:10.167 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-23T18:33:10.167 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 85183
2026-03-23T18:33:10.167 INFO:tasks.workunit.client.0.vm04.stderr:+ wait
2026-03-23T18:33:10.190 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:33:10.190 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.261 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.330 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.399 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.469 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.545 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.617 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.687 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.758 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.827 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.899 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:10.971 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.045 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.123 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.196 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.267 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.337 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.408 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:11.479 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it
2026-03-23T18:33:12.320 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' does not exist
2026-03-23T18:33:12.371 INFO:tasks.workunit.client.0.vm04.stdout:testing mirror pool peer bootstrap create...
2026-03-23T18:33:12.371 INFO:tasks.workunit.client.0.vm04.stderr:+ test_mirror_pool_peer_bootstrap_create
2026-03-23T18:33:12.371 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing mirror pool peer bootstrap create...'
2026-03-23T18:33:12.371 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-23T18:33:12.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.619 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.690 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.759 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.832 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.903 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:12.974 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.045 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.117 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.237 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.312 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.388 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.461 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.534 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.603 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.676 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.758 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-23T18:33:13.828 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd1 8 2026-03-23T18:33:14.199 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' already exists 2026-03-23T18:33:14.212 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd1 2026-03-23T18:33:17.171 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd1 image 2026-03-23T18:33:17.198 
INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T18:33:18.220 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T18:33:18.232 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T18:33:21.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd2 pool
2026-03-23T18:33:21.465 INFO:tasks.workunit.client.0.vm04.stderr:+ readarray -t MON_ADDRS
2026-03-23T18:33:21.465 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mon dump
2026-03-23T18:33:21.466 INFO:tasks.workunit.client.0.vm04.stderr:++ sed -n 's/^[0-9]: \(.*\) mon\.[a-z]$/\1/p'
2026-03-23T18:33:21.725 INFO:tasks.workunit.client.0.vm04.stderr:dumped monmap epoch 1
2026-03-23T18:33:21.737 INFO:tasks.workunit.client.0.vm04.stderr:+ BAD_MON_ADDR=1.2.3.4:6789
2026-03-23T18:33:21.737 INFO:tasks.workunit.client.0.vm04.stderr:+ MON_HOST='[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789'
2026-03-23T18:33:21.737 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789' rbd1
2026-03-23T18:33:21.737 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-23T18:33:21.766 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN='{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-23T18:33:21.767 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .fsid
2026-03-23T18:33:21.776 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_FSID=bd8fd485-7f5e-4dd1-a4fb-2f27337796a2
2026-03-23T18:33:21.777 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .client_id
2026-03-23T18:33:21.786 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_CLIENT_ID=rbd-mirror-peer
2026-03-23T18:33:21.786 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .key
2026-03-23T18:33:21.796 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_KEY=AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==
2026-03-23T18:33:21.796 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .mon_host
2026-03-23T18:33:21.805 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_MON_HOST='[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]'
2026-03-23T18:33:21.805 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph fsid
2026-03-23T18:33:22.077 INFO:tasks.workunit.client.0.vm04.stderr:+ test bd8fd485-7f5e-4dd1-a4fb-2f27337796a2 = bd8fd485-7f5e-4dd1-a4fb-2f27337796a2
2026-03-23T18:33:22.077 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph auth get-key client.rbd-mirror-peer
2026-03-23T18:33:22.340 INFO:tasks.workunit.client.0.vm04.stderr:+ test AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg== = AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==
2026-03-23T18:33:22.340 INFO:tasks.workunit.client.0.vm04.stderr:+ for addr in "${MON_ADDRS[@]}"
2026-03-23T18:33:22.340 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]'
2026-03-23T18:33:22.342 INFO:tasks.workunit.client.0.vm04.stdout:[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]
2026-03-23T18:33:22.342 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail fgrep 1.2.3.4:6789
2026-03-23T18:33:22.342 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 1.2.3.4:6789
2026-03-23T18:33:22.344 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:33:22.344 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789' rbd1
2026-03-23T18:33:22.344 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-23T18:33:22.369 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-23T18:33:22.369 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create rbd1
2026-03-23T18:33:22.369 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-23T18:33:22.394 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-23T18:33:22.394 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789' rbd2
2026-03-23T18:33:22.394 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-23T18:33:22.418 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-23T18:33:22.418 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create rbd2
2026-03-23T18:33:22.418 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-23T18:33:22.443 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","client_id":"rbd-mirror-peer","key":"AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-23T18:33:22.443 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T18:33:23.493 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T18:33:23.507 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it
2026-03-23T18:33:24.489 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' does not exist
2026-03-23T18:33:24.502 INFO:tasks.workunit.client.0.vm04.stdout:testing removing pool under running tasks...
2026-03-23T18:33:24.503 INFO:tasks.workunit.client.0.vm04.stderr:+ test_tasks_removed_pool
2026-03-23T18:33:24.503 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing removing pool under running tasks...'
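Editor's note: the bootstrap token exercised above is, as the trace shows, base64-wrapped JSON carrying `fsid`, `client_id`, `key`, and `mon_host`. A minimal Python sketch of that decode step, using sample values copied from this log (the helper name `decode_bootstrap_token` is illustrative, not part of Ceph):

```python
import base64
import json

def decode_bootstrap_token(token_b64):
    """Decode an rbd-mirror bootstrap token: base64-wrapped JSON, as seen in this log."""
    doc = json.loads(base64.b64decode(token_b64))
    # The workunit asserts these four fields match the local cluster.
    return {k: doc[k] for k in ("fsid", "client_id", "key", "mon_host")}

# Build a sample token with the exact field values observed above.
sample = {
    "fsid": "bd8fd485-7f5e-4dd1-a4fb-2f27337796a2",
    "client_id": "rbd-mirror-peer",
    "key": "AQBxh8FpnmWELRAAH/xJ4GtoUNQ0bHwNcIcyVg==",
    "mon_host": "[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]",
}
token = base64.b64encode(json.dumps(sample).encode()).decode()
fields = decode_bootstrap_token(token)
print(fields["client_id"])  # rbd-mirror-peer
```

This mirrors the shell pipeline `rbd mirror pool peer bootstrap create ... | base64 -d | jq -r .fsid` (etc.) in the trace; note the test also verifies that the bogus `1.2.3.4:6789` address passed via `--mon-host` never leaks into the token's `mon_host`.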
2026-03-23T18:33:24.503 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:33:24.503 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:24.577 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:24.647 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:24.717 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:24.788 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:24.862 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:24.935 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.008 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.081 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.152 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.224 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.296 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.368 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.446 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.517 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.586 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.656 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.728 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:25.798 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T18:33:26.500 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T18:33:26.512 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T18:33:29.472 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G foo
2026-03-23T18:33:29.515 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create foo@snap
2026-03-23T18:33:30.483 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:33:30.493 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect foo@snap
2026-03-23T18:33:30.526 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone foo@snap bar
2026-03-23T18:33:30.580 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd2/dummy
2026-03-23T18:33:30.610 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/dummy
2026-03-23T18:33:30.639 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential
2026-03-23T18:33:31.652 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:33:31.652 INFO:tasks.workunit.client.0.vm04.stdout: 1 224 273.972 274 MiB/s
2026-03-23T18:33:32.733 INFO:tasks.workunit.client.0.vm04.stdout: 2 448 237.218 237 MiB/s
2026-03-23T18:33:33.655 INFO:tasks.workunit.client.0.vm04.stdout: 3 656 233.333 233 MiB/s
2026-03-23T18:33:34.755 INFO:tasks.workunit.client.0.vm04.stdout: 4 880 225.125 225 MiB/s
2026-03-23T18:33:35.482 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 4 ops: 1024 ops/sec: 211.57 bytes/sec: 212 MiB/s
2026-03-23T18:33:35.493 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd2/dummy@snap
2026-03-23T18:33:36.090 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:33:36.096 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd2/dummy@snap
2026-03-23T18:33:36.125 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:36.125 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy1
2026-03-23T18:33:36.163 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:36.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy2
2026-03-23T18:33:36.202 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:36.202 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy3
2026-03-23T18:33:36.242 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:36.242 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy4
2026-03-23T18:33:36.282 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:36.282 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy5
2026-03-23T18:33:36.321 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-23T18:33:36.568 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-23T18:33:36.568 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:36.568 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy1
2026-03-23T18:33:37.259 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 1, "id": "2ec2f7c6-b672-423a-817e-f6e085a38ebe", "message": "Flattening image rbd2/dummy1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy1", "image_id": "3275740fed6e"}, "in_progress": true, "progress": 0.03515625}
2026-03-23T18:33:37.286 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:37.286 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy2
2026-03-23T18:33:38.127 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 2, "id": "5dc15e01-24cd-4f0a-b404-6fd6aa07e91f", "message": "Flattening image rbd2/dummy2", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy2", "image_id": "3278a158620b"}}
2026-03-23T18:33:38.149 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:38.149 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy3
2026-03-23T18:33:39.448 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 3, "id": "3be47b1a-fa72-4cd5-8f39-31225cf5c4a5", "message": "Flattening image rbd2/dummy3", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy3", "image_id": "327b39ea4816"}}
2026-03-23T18:33:39.477 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:39.477 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy4
2026-03-23T18:33:40.794 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 4, "id": "52f2179e-c6a4-4ccc-9530-c72c04d35a51", "message": "Flattening image rbd2/dummy4", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy4", "image_id": "327e2f75f45"}}
2026-03-23T18:33:40.824 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-23T18:33:40.824 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy5
2026-03-23T18:33:41.983 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 5, "id": "a116629f-a8f9-453b-8709-64f787a0576d", "message": "Flattening image rbd2/dummy5", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy5", "image_id": "3281cdaebfa8"}}
2026-03-23T18:33:41.997 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool delete rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T18:33:42.324 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T18:33:42.337
INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-23T18:33:42.586 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "id": "5dc15e01-24cd-4f0a-b404-6fd6aa07e91f",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy2",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "3278a158620b",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy2",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 2
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "id": "3be47b1a-fa72-4cd5-8f39-31225cf5c4a5",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy3",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "327b39ea4816",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy3",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 3
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "id": "52f2179e-c6a4-4ccc-9530-c72c04d35a51",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy4",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "327e2f75f45",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy4",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 4
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "id": "a116629f-a8f9-453b-8709-64f787a0576d",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy5",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "3281cdaebfa8",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy5",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 5
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr: }
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr:]' '!=' '[]'
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info bar
2026-03-23T18:33:42.587 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-23T18:33:42.623 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/foo@snap
2026-03-23T18:33:42.623 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd snap unprotect foo@snap
2026-03-23T18:33:42.623 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect foo@snap
2026-03-23T18:33:42.650 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:33:42.647+0000 7f9b89f40640 -1 librbd::SnapshotUnprotectRequest: cannot unprotect: at least 1 child(ren) [326618fae440] in pool 'rbd'
2026-03-23T18:33:42.651 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:33:42.647+0000 7f9b89f40640 -1 librbd::SnapshotUnprotectRequest: encountered error: (16) Device or resource busy
2026-03-23T18:33:42.651 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:33:42.647+0000 7f9b89f40640 -1 librbd::SnapshotUnprotectRequest: 0x55bf17e46bc0 should_complete_error: ret_val=-16
2026-03-23T18:33:42.652 INFO:tasks.workunit.client.0.vm04.stderr:rbd: unprotecting snap failed: 2026-03-23T18:33:42.651+0000 7f9b8973f640 -1 librbd::SnapshotUnprotectRequest: 0x55bf17e46bc0 should_complete_error: ret_val=-16
2026-03-23T18:33:42.652 INFO:tasks.workunit.client.0.vm04.stderr:(16) Device or resource busy
2026-03-23T18:33:42.656 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:33:42.656 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten bar
2026-03-23T18:33:42.933 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 6, "id": "0c183e2c-2452-482b-bc8c-e4230717355a", "message": "Flattening image rbd/bar", "refs": {"action": "flatten", "pool_name": "rbd", "pool_namespace": "", "image_name": "bar", "image_id": "326618fae440"}, "retry_attempts": 1, "retry_time": "2026-03-23T18:34:12.882857"}
2026-03-23T18:33:42.945 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12}
2026-03-23T18:33:42.945 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info bar
2026-03-23T18:33:42.945 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-23T18:33:42.971 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:33:42.971 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info bar
2026-03-23T18:33:42.971 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'parent: '
2026-03-23T18:33:42.972 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-23T18:33:42.997 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:33:42.997 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect foo@snap
2026-03-23T18:33:43.033 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12}
2026-03-23T18:33:43.033 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-23T18:33:43.277 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-23T18:33:43.277 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:33:43.277 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-23T18:33:43.521 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-23T18:33:43.521 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:33:43.521 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:43.594 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:43.665 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:43.735 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:43.807 INFO:tasks.workunit.client.0.vm04.stderr:+
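Editor's note: the `ceph rbd task add`/`ceph rbd task list` output above is JSON with `sequence`, `id`, `message`, and a `refs` block. A small Python sketch of parsing that shape, using one task record copied verbatim from this log (the helper `pending_flattens` is illustrative):

```python
import json

# One entry from the `ceph rbd task list` output in this log, shape unchanged.
task_list_json = """
[
  {"id": "5dc15e01-24cd-4f0a-b404-6fd6aa07e91f",
   "message": "Flattening image rbd2/dummy2",
   "refs": {"action": "flatten", "image_id": "3278a158620b",
            "image_name": "dummy2", "pool_name": "rbd2", "pool_namespace": ""},
   "sequence": 2}
]
"""

def pending_flattens(raw, pool):
    """Return ids of flatten tasks that still reference the given pool."""
    return [t["id"] for t in json.loads(raw)
            if t["refs"]["action"] == "flatten" and t["refs"]["pool_name"] == pool]

print(pending_flattens(task_list_json, "rbd2"))
# ['5dc15e01-24cd-4f0a-b404-6fd6aa07e91f']
```

This is the structure the workunit leans on: after `ceph osd pool delete rbd2`, it asserts the list is non-empty (queued tasks survive the pool removal) and later polls until `ceph rbd task list` returns `[]`.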
for img in $IMGS
2026-03-23T18:33:43.881 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:43.954 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:44.026 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:44.097 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:44.167 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:44.945 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.017 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.124 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.197 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.269 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.341 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.412 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.484 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.558 INFO:tasks.workunit.client.0.vm04.stdout:testing task handler recovery after module's RADOS client is blocklisted...
2026-03-23T18:33:45.559 INFO:tasks.workunit.client.0.vm04.stderr:+ test_tasks_recovery
2026-03-23T18:33:45.559 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing task handler recovery after module'\''s RADOS client is blocklisted...'
2026-03-23T18:33:45.559 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-23T18:33:45.559 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.631 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.704 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.774 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.846 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.921 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:45.995 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.067 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.141 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.212 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.284 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.357 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.430 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.502 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.572 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.642 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.714 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.785 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-23T18:33:46.856 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-23T18:33:47.336 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-23T18:33:47.348 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-23T18:33:50.299 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd2/img1
2026-03-23T18:33:50.329 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/img1
2026-03-23T18:33:50.357 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential
2026-03-23T18:33:51.398 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:33:51.395+0000 7f22fbfff640 0 -- 192.168.123.104:0/2500480011 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f22d40059a0 msgr2=0x7f22d4005df0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-23T18:33:51.398 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-23T18:33:51.398 INFO:tasks.workunit.client.0.vm04.stdout: 1 240 283.185 283 MiB/s
2026-03-23T18:33:52.420 INFO:tasks.workunit.client.0.vm04.stdout: 2 432 232.365 232 MiB/s
2026-03-23T18:33:53.365 INFO:tasks.workunit.client.0.vm04.stdout: 3 624 222.841 223 MiB/s
2026-03-23T18:33:54.408 INFO:tasks.workunit.client.0.vm04.stdout: 4 832 216.547 217 MiB/s
2026-03-23T18:33:54.449 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-23T18:33:54.447+0000 7f22fbfff640 0 -- 192.168.123.104:0/2500480011 >> [v2:192.168.123.104:6800/3728786032,v1:192.168.123.104:6801/3728786032] conn(0x7f22dc05d3e0 msgr2=0x7f22dc07d7e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-23T18:33:55.425 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 5 ops: 1024 ops/sec: 202.052 bytes/sec: 202 MiB/s
2026-03-23T18:33:55.433 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd2/img1@snap
2026-03-23T18:33:56.250 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-23T18:33:56.256 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd2/img1@snap
2026-03-23T18:33:56.489 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/img1@snap rbd2/clone1
2026-03-23T18:33:56.531 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump
2026-03-23T18:33:56.531 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-23T18:33:56.531 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]'
2026-03-23T18:33:56.531 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-23T18:33:56.798 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/623257793
2026-03-23T18:33:56.798 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/623257793
2026-03-23T18:33:58.318 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/623257793 until 2026-03-23T19:33:57.374205+0000 (3600 sec)
2026-03-23T18:33:58.332 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail ceph rbd task add flatten rbd2/clone1
2026-03-23T18:33:58.332 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-23T18:33:58.497 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-23T18:33:58.495+0000 7f9fbb311640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-23T18:33:58.497 INFO:tasks.workunit.client.0.vm04.stderr:Error EAGAIN: rbd_support module is not ready, try again
2026-03-23T18:33:58.501 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:33:58.501 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-23T18:34:08.502 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24
2026-03-23T18:34:08.503 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-23T18:34:08.503 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-23T18:34:09.014 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 1, "id": "faa31ffb-c097-494f-8f03-d7402fba530a", "message": "Flattening image rbd2/clone1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "clone1", "image_id": "3399b602bb36"}, "in_progress": true, "progress": 0.03515625}
2026-03-23T18:34:09.057 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:34:09.058 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "id": "faa31ffb-c097-494f-8f03-d7402fba530a",
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "in_progress": true,
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/clone1",
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "progress": 0.17578125,
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "3399b602bb36",
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "clone1",
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 1
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr: }
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr:]' '!=' '[]'
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12}
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-23T18:34:09.618 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info rbd2/clone1
2026-03-23T18:34:09.717 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd2/img1@snap
2026-03-23T18:34:09.717 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-23T18:34:19.718 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12}
2026-03-23T18:34:19.719 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info rbd2/clone1
2026-03-23T18:34:19.719 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-23T18:34:19.748 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-23T18:34:19.748 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info rbd2/clone1
2026-03-23T18:34:19.748 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'parent: '
2026-03-23T18:34:19.748 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-23T18:34:19.775 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-23T18:34:19.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect rbd2/img1@snap
2026-03-23T18:34:19.805 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-23T18:34:20.052 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-23T18:34:20.052 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-23T18:34:21.150 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-23T18:34:21.163 INFO:tasks.workunit.client.0.vm04.stdout:OK
2026-03-23T18:34:21.164 INFO:tasks.workunit.client.0.vm04.stderr:+ echo OK
2026-03-23T18:34:21.169 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-23T18:34:21.171 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-23T18:34:21.222 INFO:tasks.workunit:Stopping ['rbd/cli_generic.sh'] on client.0...
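Editor's note: the blocklist-recovery test above extracts the rbd_support module's RADOS client address from `ceph mgr dump` with a jq pipeline (`.active_clients[]`, then `addr + "/" + nonce`). A Python sketch of the same extraction; the `mgr dump` fragment below is a minimal structure inferred from that jq pipeline and the resulting `CLIENT_ADDR` in the log, not a complete dump:

```python
# Minimal fragment shaped like the relevant part of `ceph mgr dump` output
# (structure inferred from the jq pipeline in the log; values copied from it).
mgr_dump = {
    "active_clients": [
        {"name": "rbd_support",
         "addrvec": [{"type": "v2", "addr": "192.168.123.104:0", "nonce": 623257793}]}
    ]
}

def rbd_support_addr(dump):
    """Replicate the log's jq pipeline: first addrvec entry's addr + "/" + nonce."""
    client = next(c for c in dump["active_clients"] if c["name"] == "rbd_support")
    a = client["addrvec"][0]
    return f'{a["addr"]}/{a["nonce"]}'

print(rbd_support_addr(mgr_dump))  # 192.168.123.104:0/623257793
```

That address is what the test feeds to `ceph osd blocklist add`; the subsequent `Error EAGAIN: rbd_support module is not ready` followed by a successful `ceph rbd task add` shows the module re-establishing its client after being blocklisted.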
2026-03-23T18:34:21.222 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-23T18:34:21.561 DEBUG:teuthology.parallel:result is None
2026-03-23T18:34:21.565 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-23T18:34:21.572 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-23T18:34:21.572 DEBUG:teuthology.orchestra.run.vm04:> rmdir -- /home/ubuntu/cephtest/mnt.0
2026-03-23T18:34:21.617 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0
2026-03-23T18:34:21.618 DEBUG:teuthology.run_tasks:Unwinding manager ceph
2026-03-23T18:34:21.649 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean
2026-03-23T18:34:21.649 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json
2026-03-23T18:34:21.815 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-23T18:34:21.817 INFO:teuthology.orchestra.run.vm04.stderr:dumped all
2026-03-23T18:34:21.827
INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":2154,"stamp":"2026-03-23T18:34:21.098920+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":4295426622,"num_objects":1035,"num_object_clones":0,"num_object_copies":3096,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1035,"num_whiteouts":0,"num_read":109967,"num_read_kb":6883121,"num_write":55635,"num_write_kb":13124374,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":57301,"ondisk_log_size":57301,"up":30,"acting":30,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":46,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6934008,"kb_used_data":3153800,"kb_used_omap":1655,"kb_used_meta":3778504,"kb_avail":276181512,"statfs":{"total":289910292480,"available":282809868288,"internally_reserved":0,"allocated":3229491200,"data_stored":6446811002,"data_compressed":27529778,"data_compressed_allocated":3221667840,"data_compressed_original":6443335680,"omap_allocated":1695309,"internal_metadata":3869188531},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_rep
aired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":847249063,"num_objects":190,"num_object_clones":0,"num_object_copies":582,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":190,"num_whiteouts":0,"num_read":-1822,"num_read_kb":809165,"num_write":-224,"num_write_kb":826774,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.063854"},"pg_stats":[{"pgid":"2.7","version":"240'2546","reported_seq":5883,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254573+0000","last_change":"2026-03-23T17:30:51.694301+0000","last_active":"2026-03-23T18:34:01.254573+0000","last_peered":"2026-03-23T18:34:01.254573+0000","last_clean":"2026-03-23T18:34:01.254573+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T18:34:01.254573+0000","last_undegraded":"2026-03-23T18:34:01.25457
3+0000","last_fullsized":"2026-03-23T18:34:01.254573+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2546,"log_dups_size":0,"ondisk_log_size":2546,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T18:00:58.180879+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6735,"num_read_kb":5408,"num_write":2151,"num_write_kb":4008,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'3471","reported_seq":7143,"reported_epoch":248,"state"
:"active+clean","last_fresh":"2026-03-23T18:34:01.254626+0000","last_change":"2026-03-23T17:30:51.694315+0000","last_active":"2026-03-23T18:34:01.254626+0000","last_peered":"2026-03-23T18:34:01.254626+0000","last_clean":"2026-03-23T18:34:01.254626+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T18:34:01.254626+0000","last_undegraded":"2026-03-23T18:34:01.254626+0000","last_fullsized":"2026-03-23T18:34:01.254626+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3471,"log_dups_size":0,"ondisk_log_size":3471,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T23:06:50.118782+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7588,"num_read_kb":6322,"num_write":2108,"num_write_kb":3875,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'2967","reported_seq":7118,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254767+0000","last_change":"2026-03-23T17:30:51.694436+0000","last_active":"2026-03-23T18:34:01.254767+0000","last_peered":"2026-03-23T18:34:01.254767+0000","last_clean":"2026-03-23T18:34:01.254767+0000","last_became_active":"2026-03-23T17:30:49.500052+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T18:34:01.254767+0000","last_undegraded":"2026-03-23T18:34:01.254767+0000","last_fullsized":"2026-03-23T18:34:01.254767+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_s
crub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2967,"log_dups_size":0,"ondisk_log_size":2967,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T19:02:09.818760+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":9669,"num_read_kb":8128,"num_write":2574,"num_write_kb":4490,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'5143","reported_seq":11623,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254688+0000","last_change":"2026-03-23T17:30:51.694557+0000","last_active":"2026-03-23T18:34:01.254688+0000","last_peered":"2026-03-23T18:34:01.254688+0000","last_clean":"2026-03-23T18:34:01.254688+0000","last_became_active":"2026-03-23T17
:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T18:34:01.254688+0000","last_undegraded":"2026-03-23T18:34:01.254688+0000","last_fullsized":"2026-03-23T18:34:01.254688+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":5143,"log_dups_size":0,"ondisk_log_size":5143,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:57:46.539497+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":13592,"num_read_kb":11780,"num_write":3593,"num_write_kb":4723,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[]
,"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"3.3","version":"248'5903","reported_seq":7133,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.515407+0000","last_change":"2026-03-23T18:33:44.838580+0000","last_active":"2026-03-23T18:34:14.515407+0000","last_peered":"2026-03-23T18:34:14.515407+0000","last_clean":"2026-03-23T18:34:14.515407+0000","last_became_active":"2026-03-23T17:30:59.775747+0000","last_became_peered":"2026-03-23T17:30:59.775747+0000","last_unstale":"2026-03-23T18:34:14.515407+0000","last_undegraded":"2026-03-23T18:34:14.515407+0000","last_fullsized":"2026-03-23T18:34:14.515407+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5903,"log_dups_size":0,"ondisk_log_size":5903,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T01:57:47.976409+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000212728,"stat_sum":{"num_bytes":1065353216,"num_objects":255,"num_object_clones":0,"num_object_copies":765,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":255,"num_whiteouts":0,"num_read":511,"num_read_kb":1613824,"num_write":5939,"num_write_kb":3420754,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.2","version":"240'3916","reported_seq":10714,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.248058+0000","last_change":"2026-03-23T17:30:51.694479+0000","last_active":"2026-03-23T18:34:01.248058+0000","last_peered":"2026-03-23T18:34:01.248058+0000","last_clean":"2026-03-23T18:34:01.248058+0000","last_became_active":"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T18:34:01.248058+0000","last_undegraded":"2026-03-23T18:34:01.248058+0000","last_fullsized":"2026-03-23T18:34:01.248058+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0
","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3916,"log_dups_size":0,"ondisk_log_size":3916,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:34:59.316767+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":17873,"num_read_kb":15444,"num_write":4416,"num_write_kb":6280,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"3.0","version":"248'6283","reported_seq":7589,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369816+0000","last_change":"2026-03-23T18:33:44.838919+0000","last_active":"2026-03-23T18:34:14.369816+0000","last_peered":"2026-03-23T18:34:14.369816+0000","last_clean":"2026-03-23T18:34:14.369816+0000","last_became_active":"2026-03-
23T17:30:59.777274+0000","last_became_peered":"2026-03-23T17:30:59.777274+0000","last_unstale":"2026-03-23T18:34:14.369816+0000","last_undegraded":"2026-03-23T18:34:14.369816+0000","last_fullsized":"2026-03-23T18:34:14.369816+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6283,"log_dups_size":0,"ondisk_log_size":6283,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T23:26:50.313263+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00075581400000000005,"stat_sum":{"num_bytes":981467136,"num_objects":234,"num_object_clones":0,"num_object_copies":702,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":234,"num_whiteouts":0,"num_read":590,"num_read_kb":1814159,"num_write":6318,"num_write_kb":2953208,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,
2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.1","version":"240'6570","reported_seq":10591,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.018950+0000","last_change":"2026-03-23T17:30:51.712676+0000","last_active":"2026-03-23T18:34:02.018950+0000","last_peered":"2026-03-23T18:34:02.018950+0000","last_clean":"2026-03-23T18:34:02.018950+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T18:34:02.018950+0000","last_undegraded":"2026-03-23T18:34:02.018950+0000","last_fullsized":"2026-03-23T18:34:02.018950+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":6570,"log_dups_size":0,"ondisk_log_size":6570,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T02:06:03.323628+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":31326,"num_read_kb":26121,"num_write":9933,"num_write_kb":11655,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.1","version":"248'5790","reported_seq":7029,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.438929+0000","last_change":"2026-03-23T18:33:44.838733+0000","last_active":"2026-03-23T18:34:14.438929+0000","last_peered":"2026-03-23T18:34:14.438929+0000","last_clean":"2026-03-23T18:34:14.438929+0000","last_became_active":"2026-03-23T17:30:59.775767+0000","last_became_peered":"2026-03-23T17:30:59.775767+0000","last_unstale":"2026-03-23T18:34:14.438929+0000","last_undegraded":"2026-03-23T18:34:14.438929+0000","last_fullsized":"2026-03-23T18:34:14.438929+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_dee
p_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5790,"log_dups_size":0,"ondisk_log_size":5790,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:06:35.812683+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026309199999999998,"stat_sum":{"num_bytes":1124073472,"num_objects":268,"num_object_clones":0,"num_object_copies":804,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":268,"num_whiteouts":0,"num_read":552,"num_read_kb":1617042,"num_write":5876,"num_write_kb":3343840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.0","version":"240'3582","reported_seq":7844,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.019013+0000","last_change":"2026-03-23T17:30:51.712764+0000","last_active":"2026-03-23T18:34:02.019013+0000","last_peered":"2026-03-23T18:34:02.019013+0000","last_clean":"2026-03-23T18:34:02.019013+0000","last_bec
ame_active":"2026-03-23T17:30:49.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T18:34:02.019013+0000","last_undegraded":"2026-03-23T18:34:02.019013+0000","last_fullsized":"2026-03-23T18:34:02.019013+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3582,"log_dups_size":0,"ondisk_log_size":3582,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T17:43:01.349684+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8802,"num_read_kb":7268,"num_write":2362,"num_write_kb":4390,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,
1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.2","version":"248'6121","reported_seq":7745,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369716+0000","last_change":"2026-03-23T18:33:44.839097+0000","last_active":"2026-03-23T18:34:14.369716+0000","last_peered":"2026-03-23T18:34:14.369716+0000","last_clean":"2026-03-23T18:34:14.369716+0000","last_became_active":"2026-03-23T17:30:59.775868+0000","last_became_peered":"2026-03-23T17:30:59.775868+0000","last_unstale":"2026-03-23T18:34:14.369716+0000","last_undegraded":"2026-03-23T18:34:14.369716+0000","last_fullsized":"2026-03-23T18:34:14.369716+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6121,"log_dups_size":0,"ondisk_log_size":6121,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T20:19:50.638326+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00085092399999999997,"stat_sum":{"num_bytes":1124073491,"num_objects":269,"num_object_clones":0,"num_object_copies":807,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":269,"num_whiteouts":0,"num_read":929,"num_read_kb":1747411,"num_write":6165,"num_write_kb":3360461,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"240'4977","reported_seq":11023,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254701+0000","last_change":"2026-03-23T17:30:51.694574+0000","last_active":"2026-03-23T18:34:01.254701+0000","last_peered":"2026-03-23T18:34:01.254701+0000","last_clean":"2026-03-23T18:34:01.254701+0000","last_became_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T18:34:01.254701+0000","last_undegraded":"2026-03-23T18:34:01.254701+0000","last_fullsized":"2026-03-23T18:34:01.254701+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_
scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":4977,"log_dups_size":0,"ondisk_log_size":4977,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T00:28:04.686913+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":11754,"num_read_kb":10177,"num_write":4143,"num_write_kb":6106,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":559,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254730+0000","last_change":"2026-03-23T17:30:46.635114+0000","last_active":"2026-03-23T18:34:01.254730+0000","last_peered":"2026-03-23T18:34:01.254730+0000","last_clean":"2026-03-23T18:34:01.254730+0000","last_became_
active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T18:34:01.254730+0000","last_undegraded":"2026-03-23T18:34:01.254730+0000","last_fullsized":"2026-03-23T18:34:01.254730+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_clean_scrub_stamp":"2026-03-23T17:30:45.468031+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T03:27:15.872490+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_
location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":3,"num_pg":4,"stat_sum":{"num_bytes":4294967315,"num_objects":1026,"num_object_clones":0,"num_object_copies":3078,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1026,"num_whiteouts":0,"num_read":2582,"num_read_kb":6792436,"num_write":24298,"num_write_kb":13078263,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":3221237760,"data_stored":6442463232,"data_compressed":27525120,"data_compressed_allocated":3221225472,"data_compressed_original":6442450944,"omap_allocated":0,"internal_metadata":0},"log_size":24097,"ondisk_log_size":24097,"up":12,"acting":12,"num_store_stats":3},{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":107339,"num_read_kb":90648,"num_write":31280,"num_write_kb":45527,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"
num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1627427,"internal_metadata":0},"log_size":33172,"ondisk_log_size":33172,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739135,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,
"kb_used":2485676,"kb_used_data":1051100,"kb_used_omap":395,"kb_used_meta":1434164,"kb_avail":91886164,"statfs":{"total":96636764160,"available":94091431936,"internally_reserved":0,"allocated":1076326400,"data_stored":2148630699,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":405130,"internal_metadata":1468584310},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739136,"num_pgs":17,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2197088,"kb_used_data":1051344,"kb_used_omap":745,"kb_used_meta":1144982,"kb_avail":92174752,"statfs":{"total":96636764160,"available":94386946048,"internally_reserved":0,"allocated":1076576256,"data_stored":2149090085,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":763282,"internal_metadata":1172462190},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739133,"num_pgs":16,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2251244,"kb_used_data":1051356,"kb_used_omap":514,"kb_used_meta":1199357,"kb_avail":92120596,"statfs":{"total":96636764160,"available":94331490304,"internally_reserved":0,"allocated":1076588544,"data_stored":2149090218,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":526897,"internal_metadata":1228142031},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards
_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":525307,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":735134,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":366986,"internal_metadata":0},{"poolid":3,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"da
ta_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-23T18:34:21.829 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-23T18:34:21.982 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T18:34:21.982 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-23T18:34:21.994 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":2154,"stamp":"2026-03-23T18:34:21.098920+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":4295426622,"num_objects":1035,"num_object_clones":0,"num_object_copies":3096,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1035,"num_whiteouts":0,"num_read":109967,"num_read_kb":6883121,"num_write":55635,"num_write_kb":13124374,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":57301,"ondisk_log_size":57301,"up":30,"acting":30,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":46,"num_osds":3,"num_per_pool_osds":3,"num_per_
pool_omap_osds":3,"kb":283115520,"kb_used":6934008,"kb_used_data":3153800,"kb_used_omap":1655,"kb_used_meta":3778504,"kb_avail":276181512,"statfs":{"total":289910292480,"available":282809868288,"internally_reserved":0,"allocated":3229491200,"data_stored":6446811002,"data_compressed":27529778,"data_compressed_allocated":3221667840,"data_compressed_original":6443335680,"omap_allocated":1695309,"internal_metadata":3869188531},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":847249063,"num_objects":190,"num_object_clones":0,"num_object_copies":582,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":190,"num_whiteouts":0,"num_read":-1822,"num_read_kb":809165,"num_write":-224,"num_write_kb":826774,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.063854"},"pg_stats":[{"pgid":"2.7","version":"240'2546","re
ported_seq":5883,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254573+0000","last_change":"2026-03-23T17:30:51.694301+0000","last_active":"2026-03-23T18:34:01.254573+0000","last_peered":"2026-03-23T18:34:01.254573+0000","last_clean":"2026-03-23T18:34:01.254573+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T18:34:01.254573+0000","last_undegraded":"2026-03-23T18:34:01.254573+0000","last_fullsized":"2026-03-23T18:34:01.254573+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2546,"log_dups_size":0,"ondisk_log_size":2546,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T18:00:58.180879+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6735,"num_read_kb":5408,"num_write":2151,"num_write_kb":4008,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'3471","reported_seq":7143,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254626+0000","last_change":"2026-03-23T17:30:51.694315+0000","last_active":"2026-03-23T18:34:01.254626+0000","last_peered":"2026-03-23T18:34:01.254626+0000","last_clean":"2026-03-23T18:34:01.254626+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T18:34:01.254626+0000","last_undegraded":"2026-03-23T18:34:01.254626+0000","last_fullsized":"2026-03-23T18:34:01.254626+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp"
:"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3471,"log_dups_size":0,"ondisk_log_size":3471,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T23:06:50.118782+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7588,"num_read_kb":6322,"num_write":2108,"num_write_kb":3875,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'2967","reported_seq":7118,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254767+0000","last_change":"2026-03-23T17:30:51.694436+0000","last_active":"2026-03-23T18:34:01.254767+0000","last_peered":"2026-03-23T18:34:01.254767+0000","last_clean":"2026-03-23T18:34:01.254767+0000","last_became_active":"2026-03-23T17:30:49.50005
2+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T18:34:01.254767+0000","last_undegraded":"2026-03-23T18:34:01.254767+0000","last_fullsized":"2026-03-23T18:34:01.254767+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2967,"log_dups_size":0,"ondisk_log_size":2967,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T19:02:09.818760+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":9669,"num_read_kb":8128,"num_write":2574,"num_write_kb":4490,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locat
ion_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'5143","reported_seq":11623,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254688+0000","last_change":"2026-03-23T17:30:51.694557+0000","last_active":"2026-03-23T18:34:01.254688+0000","last_peered":"2026-03-23T18:34:01.254688+0000","last_clean":"2026-03-23T18:34:01.254688+0000","last_became_active":"2026-03-23T17:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T18:34:01.254688+0000","last_undegraded":"2026-03-23T18:34:01.254688+0000","last_fullsized":"2026-03-23T18:34:01.254688+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":5143,"log_dups_size":0,"ondisk_log_size":5143,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T20:57:46.539497+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":13592,"num_read_kb":11780,"num_write":3593,"num_write_kb":4723,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"3.3","version":"248'5903","reported_seq":7133,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.515407+0000","last_change":"2026-03-23T18:33:44.838580+0000","last_active":"2026-03-23T18:34:14.515407+0000","last_peered":"2026-03-23T18:34:14.515407+0000","last_clean":"2026-03-23T18:34:14.515407+0000","last_became_active":"2026-03-23T17:30:59.775747+0000","last_became_peered":"2026-03-23T17:30:59.775747+0000","last_unstale":"2026-03-23T18:34:14.515407+0000","last_undegraded":"2026-03-23T18:34:14.515407+0000","last_fullsized":"2026-03-23T18:34:14.515407+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep
_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5903,"log_dups_size":0,"ondisk_log_size":5903,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T01:57:47.976409+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000212728,"stat_sum":{"num_bytes":1065353216,"num_objects":255,"num_object_clones":0,"num_object_copies":765,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":255,"num_whiteouts":0,"num_read":511,"num_read_kb":1613824,"num_write":5939,"num_write_kb":3420754,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.2","version":"240'3916","reported_seq":10714,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.248058+0000","last_change":"2026-03-23T17:30:51.694479+0000","last_active":"2026-03-23T18:34:01.248058+0000","last_peered":"2026-03-23T18:34:01.248058+0000","last_clean":"2026-03-23T18:34:01.248058+0000","last_became_active"
:"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T18:34:01.248058+0000","last_undegraded":"2026-03-23T18:34:01.248058+0000","last_fullsized":"2026-03-23T18:34:01.248058+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3916,"log_dups_size":0,"ondisk_log_size":3916,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:34:59.316767+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":17873,"num_read_kb":15444,"num_write":4416,"num_write_kb":6280,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missin
g":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"3.0","version":"248'6283","reported_seq":7589,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369816+0000","last_change":"2026-03-23T18:33:44.838919+0000","last_active":"2026-03-23T18:34:14.369816+0000","last_peered":"2026-03-23T18:34:14.369816+0000","last_clean":"2026-03-23T18:34:14.369816+0000","last_became_active":"2026-03-23T17:30:59.777274+0000","last_became_peered":"2026-03-23T17:30:59.777274+0000","last_unstale":"2026-03-23T18:34:14.369816+0000","last_undegraded":"2026-03-23T18:34:14.369816+0000","last_fullsized":"2026-03-23T18:34:14.369816+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6283,"log_dups_size":0,"ondisk_log_size":6283,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T23:26:50.313263+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00075581400000000005,"stat_sum":{"num_bytes":981467136,"num_objects":234,"num_object_clones":0,"num_object_copies":702,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":234,"num_whiteouts":0,"num_read":590,"num_read_kb":1814159,"num_write":6318,"num_write_kb":2953208,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.1","version":"240'6570","reported_seq":10591,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.018950+0000","last_change":"2026-03-23T17:30:51.712676+0000","last_active":"2026-03-23T18:34:02.018950+0000","last_peered":"2026-03-23T18:34:02.018950+0000","last_clean":"2026-03-23T18:34:02.018950+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T18:34:02.018950+0000","last_undegraded":"2026-03-23T18:34:02.018950+0000","last_fullsized":"2026-03-23T18:34:02.018950+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_s
crub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":6570,"log_dups_size":0,"ondisk_log_size":6570,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:06:03.323628+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":31326,"num_read_kb":26121,"num_write":9933,"num_write_kb":11655,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.1","version":"248'5790","reported_seq":7029,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.438929+0000","last_change":"2026-03-23T18:33:44.838733+0000","last_active":"2026-03-23T18:34:14.438929+0000","last_peered":"2026-03-23T18:34:14.438929+0000","last_clean":"2026-03-23T18:34:14.438929+0000","last_bec
ame_active":"2026-03-23T17:30:59.775767+0000","last_became_peered":"2026-03-23T17:30:59.775767+0000","last_unstale":"2026-03-23T18:34:14.438929+0000","last_undegraded":"2026-03-23T18:34:14.438929+0000","last_fullsized":"2026-03-23T18:34:14.438929+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5790,"log_dups_size":0,"ondisk_log_size":5790,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:06:35.812683+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026309199999999998,"stat_sum":{"num_bytes":1124073472,"num_objects":268,"num_object_clones":0,"num_object_copies":804,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":268,"num_whiteouts":0,"num_read":552,"num_read_kb":1617042,"num_write":5876,"num_write_kb":3343840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up
":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.0","version":"240'3582","reported_seq":7844,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.019013+0000","last_change":"2026-03-23T17:30:51.712764+0000","last_active":"2026-03-23T18:34:02.019013+0000","last_peered":"2026-03-23T18:34:02.019013+0000","last_clean":"2026-03-23T18:34:02.019013+0000","last_became_active":"2026-03-23T17:30:49.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T18:34:02.019013+0000","last_undegraded":"2026-03-23T18:34:02.019013+0000","last_fullsized":"2026-03-23T18:34:02.019013+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3582,"log_dups_size":0,"ondisk_log_size":3582,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T17:43:01.349684+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8802,"num_read_kb":7268,"num_write":2362,"num_write_kb":4390,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.2","version":"248'6121","reported_seq":7745,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369716+0000","last_change":"2026-03-23T18:33:44.839097+0000","last_active":"2026-03-23T18:34:14.369716+0000","last_peered":"2026-03-23T18:34:14.369716+0000","last_clean":"2026-03-23T18:34:14.369716+0000","last_became_active":"2026-03-23T17:30:59.775868+0000","last_became_peered":"2026-03-23T17:30:59.775868+0000","last_unstale":"2026-03-23T18:34:14.369716+0000","last_undegraded":"2026-03-23T18:34:14.369716+0000","last_fullsized":"2026-03-23T18:34:14.369716+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_s
crub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6121,"log_dups_size":0,"ondisk_log_size":6121,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:19:50.638326+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00085092399999999997,"stat_sum":{"num_bytes":1124073491,"num_objects":269,"num_object_clones":0,"num_object_copies":807,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":269,"num_whiteouts":0,"num_read":929,"num_read_kb":1747411,"num_write":6165,"num_write_kb":3360461,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"240'4977","reported_seq":11023,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254701+0000","last_change":"2026-03-23T17:30:51.694574+0000","last_active":"2026-03-23T18:34:01.254701+0000","last_peered":"2026-03-23T18:34:01.254701+0000","last_clean":"2026-03-23T18:34:01.254701+0000","last_becam
e_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T18:34:01.254701+0000","last_undegraded":"2026-03-23T18:34:01.254701+0000","last_fullsized":"2026-03-23T18:34:01.254701+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":4977,"log_dups_size":0,"ondisk_log_size":4977,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T00:28:04.686913+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":11754,"num_read_kb":10177,"num_write":4143,"num_write_kb":6106,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,
2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":559,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254730+0000","last_change":"2026-03-23T17:30:46.635114+0000","last_active":"2026-03-23T18:34:01.254730+0000","last_peered":"2026-03-23T18:34:01.254730+0000","last_clean":"2026-03-23T18:34:01.254730+0000","last_became_active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T18:34:01.254730+0000","last_undegraded":"2026-03-23T18:34:01.254730+0000","last_fullsized":"2026-03-23T18:34:01.254730+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_clean_scrub_stamp":"2026-03-23T17:30:45.468031+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T03:27:15.872490+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":3,"num_pg":4,"stat_sum":{"num_bytes":4294967315,"num_objects":1026,"num_object_clones":0,"num_object_copies":3078,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1026,"num_whiteouts":0,"num_read":2582,"num_read_kb":6792436,"num_write":24298,"num_write_kb":13078263,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_
objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":3221237760,"data_stored":6442463232,"data_compressed":27525120,"data_compressed_allocated":3221225472,"data_compressed_original":6442450944,"omap_allocated":0,"internal_metadata":0},"log_size":24097,"ondisk_log_size":24097,"up":12,"acting":12,"num_store_stats":3},{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":107339,"num_read_kb":90648,"num_write":31280,"num_write_kb":45527,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1627427,"internal_metadata":0},"log_size":33172,"ondisk_log_size":33172,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_wr
ite":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739135,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2485676,"kb_used_data":1051100,"kb_used_omap":395,"kb_used_meta":1434164,"kb_avail":91886164,"statfs":{"total":96636764160,"available":94091431936,"internally_reserved":0,"allocated":1076326400,"data_stored":2148630699,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":405130,"internal_metadata":1468584310},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739136,"num_pgs":17,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2197088,"kb_used_data":1051344,"kb_used_omap":745,"kb_used_meta":1144982,"kb_avail":92174752,"statfs":{"total":96636764160,"available":94386946048,"internally_reserved":0,"al
located":1076576256,"data_stored":2149090085,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":763282,"internal_metadata":1172462190},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739133,"num_pgs":16,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2251244,"kb_used_data":1051356,"kb_used_omap":514,"kb_used_meta":1199357,"kb_avail":92120596,"statfs":{"total":96636764160,"available":94331490304,"internally_reserved":0,"allocated":1076588544,"data_stored":2149090218,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":526897,"internal_metadata":1228142031},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":525307,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"i
nternally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":735134,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":366986,"internal_metadata":0},{"poolid":3,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-23T18:34:21.995 INFO:tasks.ceph.ceph_manager.ceph:clean! 
2026-03-23T18:34:21.995 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-23T18:34:22.151 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T18:34:22.151 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-23T18:34:22.164 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":2154,"stamp":"2026-03-23T18:34:21.098920+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":4295426622,"num_objects":1035,"num_object_clones":0,"num_object_copies":3096,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1035,"num_whiteouts":0,"num_read":109967,"num_read_kb":6883121,"num_write":55635,"num_write_kb":13124374,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":57301,"ondisk_log_size":57301,"up":30,"acting":30,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":46,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6934008,"kb_used_data":3153800,"kb_used_omap":1655,"kb_used_meta":3778504,"kb_avail":276181512,"statfs
":{"total":289910292480,"available":282809868288,"internally_reserved":0,"allocated":3229491200,"data_stored":6446811002,"data_compressed":27529778,"data_compressed_allocated":3221667840,"data_compressed_original":6443335680,"omap_allocated":1695309,"internal_metadata":3869188531},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":847249063,"num_objects":190,"num_object_clones":0,"num_object_copies":582,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":190,"num_whiteouts":0,"num_read":-1822,"num_read_kb":809165,"num_write":-224,"num_write_kb":826774,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.063854"},"pg_stats":[{"pgid":"2.7","version":"240'2546","reported_seq":5883,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254573+0000","last_change":"2026-03-23T17:30:51.69
4301+0000","last_active":"2026-03-23T18:34:01.254573+0000","last_peered":"2026-03-23T18:34:01.254573+0000","last_clean":"2026-03-23T18:34:01.254573+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T18:34:01.254573+0000","last_undegraded":"2026-03-23T18:34:01.254573+0000","last_fullsized":"2026-03-23T18:34:01.254573+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2546,"log_dups_size":0,"ondisk_log_size":2546,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T18:00:58.180879+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6735,"num_read_kb":5408,"num_write":2151,"num_write_kb":4008,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'3471","reported_seq":7143,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254626+0000","last_change":"2026-03-23T17:30:51.694315+0000","last_active":"2026-03-23T18:34:01.254626+0000","last_peered":"2026-03-23T18:34:01.254626+0000","last_clean":"2026-03-23T18:34:01.254626+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T18:34:01.254626+0000","last_undegraded":"2026-03-23T18:34:01.254626+0000","last_fullsized":"2026-03-23T18:34:01.254626+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp"
:"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3471,"log_dups_size":0,"ondisk_log_size":3471,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T23:06:50.118782+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7588,"num_read_kb":6322,"num_write":2108,"num_write_kb":3875,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'2967","reported_seq":7118,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254767+0000","last_change":"2026-03-23T17:30:51.694436+0000","last_active":"2026-03-23T18:34:01.254767+0000","last_peered":"2026-03-23T18:34:01.254767+0000","last_clean":"2026-03-23T18:34:01.254767+0000","last_became_active":"2026-03-23T17:30:49.50005
2+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T18:34:01.254767+0000","last_undegraded":"2026-03-23T18:34:01.254767+0000","last_fullsized":"2026-03-23T18:34:01.254767+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2967,"log_dups_size":0,"ondisk_log_size":2967,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T19:02:09.818760+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":9669,"num_read_kb":8128,"num_write":2574,"num_write_kb":4490,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locat
ion_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'5143","reported_seq":11623,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254688+0000","last_change":"2026-03-23T17:30:51.694557+0000","last_active":"2026-03-23T18:34:01.254688+0000","last_peered":"2026-03-23T18:34:01.254688+0000","last_clean":"2026-03-23T18:34:01.254688+0000","last_became_active":"2026-03-23T17:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T18:34:01.254688+0000","last_undegraded":"2026-03-23T18:34:01.254688+0000","last_fullsized":"2026-03-23T18:34:01.254688+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":5143,"log_dups_size":0,"ondisk_log_size":5143,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T20:57:46.539497+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":13592,"num_read_kb":11780,"num_write":3593,"num_write_kb":4723,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"3.3","version":"248'5903","reported_seq":7133,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.515407+0000","last_change":"2026-03-23T18:33:44.838580+0000","last_active":"2026-03-23T18:34:14.515407+0000","last_peered":"2026-03-23T18:34:14.515407+0000","last_clean":"2026-03-23T18:34:14.515407+0000","last_became_active":"2026-03-23T17:30:59.775747+0000","last_became_peered":"2026-03-23T17:30:59.775747+0000","last_unstale":"2026-03-23T18:34:14.515407+0000","last_undegraded":"2026-03-23T18:34:14.515407+0000","last_fullsized":"2026-03-23T18:34:14.515407+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep
_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5903,"log_dups_size":0,"ondisk_log_size":5903,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T01:57:47.976409+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000212728,"stat_sum":{"num_bytes":1065353216,"num_objects":255,"num_object_clones":0,"num_object_copies":765,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":255,"num_whiteouts":0,"num_read":511,"num_read_kb":1613824,"num_write":5939,"num_write_kb":3420754,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.2","version":"240'3916","reported_seq":10714,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.248058+0000","last_change":"2026-03-23T17:30:51.694479+0000","last_active":"2026-03-23T18:34:01.248058+0000","last_peered":"2026-03-23T18:34:01.248058+0000","last_clean":"2026-03-23T18:34:01.248058+0000","last_became_active"
:"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T18:34:01.248058+0000","last_undegraded":"2026-03-23T18:34:01.248058+0000","last_fullsized":"2026-03-23T18:34:01.248058+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3916,"log_dups_size":0,"ondisk_log_size":3916,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:34:59.316767+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":17873,"num_read_kb":15444,"num_write":4416,"num_write_kb":6280,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missin
g":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"3.0","version":"248'6283","reported_seq":7589,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369816+0000","last_change":"2026-03-23T18:33:44.838919+0000","last_active":"2026-03-23T18:34:14.369816+0000","last_peered":"2026-03-23T18:34:14.369816+0000","last_clean":"2026-03-23T18:34:14.369816+0000","last_became_active":"2026-03-23T17:30:59.777274+0000","last_became_peered":"2026-03-23T17:30:59.777274+0000","last_unstale":"2026-03-23T18:34:14.369816+0000","last_undegraded":"2026-03-23T18:34:14.369816+0000","last_fullsized":"2026-03-23T18:34:14.369816+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6283,"log_dups_size":0,"ondisk_log_size":6283,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T23:26:50.313263+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00075581400000000005,"stat_sum":{"num_bytes":981467136,"num_objects":234,"num_object_clones":0,"num_object_copies":702,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":234,"num_whiteouts":0,"num_read":590,"num_read_kb":1814159,"num_write":6318,"num_write_kb":2953208,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.1","version":"240'6570","reported_seq":10591,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.018950+0000","last_change":"2026-03-23T17:30:51.712676+0000","last_active":"2026-03-23T18:34:02.018950+0000","last_peered":"2026-03-23T18:34:02.018950+0000","last_clean":"2026-03-23T18:34:02.018950+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T18:34:02.018950+0000","last_undegraded":"2026-03-23T18:34:02.018950+0000","last_fullsized":"2026-03-23T18:34:02.018950+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_s
crub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":6570,"log_dups_size":0,"ondisk_log_size":6570,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:06:03.323628+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":31326,"num_read_kb":26121,"num_write":9933,"num_write_kb":11655,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.1","version":"248'5790","reported_seq":7029,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.438929+0000","last_change":"2026-03-23T18:33:44.838733+0000","last_active":"2026-03-23T18:34:14.438929+0000","last_peered":"2026-03-23T18:34:14.438929+0000","last_clean":"2026-03-23T18:34:14.438929+0000","last_bec
ame_active":"2026-03-23T17:30:59.775767+0000","last_became_peered":"2026-03-23T17:30:59.775767+0000","last_unstale":"2026-03-23T18:34:14.438929+0000","last_undegraded":"2026-03-23T18:34:14.438929+0000","last_fullsized":"2026-03-23T18:34:14.438929+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5790,"log_dups_size":0,"ondisk_log_size":5790,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:06:35.812683+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026309199999999998,"stat_sum":{"num_bytes":1124073472,"num_objects":268,"num_object_clones":0,"num_object_copies":804,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":268,"num_whiteouts":0,"num_read":552,"num_read_kb":1617042,"num_write":5876,"num_write_kb":3343840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up
":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.0","version":"240'3582","reported_seq":7844,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.019013+0000","last_change":"2026-03-23T17:30:51.712764+0000","last_active":"2026-03-23T18:34:02.019013+0000","last_peered":"2026-03-23T18:34:02.019013+0000","last_clean":"2026-03-23T18:34:02.019013+0000","last_became_active":"2026-03-23T17:30:49.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T18:34:02.019013+0000","last_undegraded":"2026-03-23T18:34:02.019013+0000","last_fullsized":"2026-03-23T18:34:02.019013+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3582,"log_dups_size":0,"ondisk_log_size":3582,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T17:43:01.349684+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8802,"num_read_kb":7268,"num_write":2362,"num_write_kb":4390,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.2","version":"248'6121","reported_seq":7745,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369716+0000","last_change":"2026-03-23T18:33:44.839097+0000","last_active":"2026-03-23T18:34:14.369716+0000","last_peered":"2026-03-23T18:34:14.369716+0000","last_clean":"2026-03-23T18:34:14.369716+0000","last_became_active":"2026-03-23T17:30:59.775868+0000","last_became_peered":"2026-03-23T17:30:59.775868+0000","last_unstale":"2026-03-23T18:34:14.369716+0000","last_undegraded":"2026-03-23T18:34:14.369716+0000","last_fullsized":"2026-03-23T18:34:14.369716+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_s
crub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6121,"log_dups_size":0,"ondisk_log_size":6121,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:19:50.638326+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00085092399999999997,"stat_sum":{"num_bytes":1124073491,"num_objects":269,"num_object_clones":0,"num_object_copies":807,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":269,"num_whiteouts":0,"num_read":929,"num_read_kb":1747411,"num_write":6165,"num_write_kb":3360461,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"240'4977","reported_seq":11023,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254701+0000","last_change":"2026-03-23T17:30:51.694574+0000","last_active":"2026-03-23T18:34:01.254701+0000","last_peered":"2026-03-23T18:34:01.254701+0000","last_clean":"2026-03-23T18:34:01.254701+0000","last_becam
e_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T18:34:01.254701+0000","last_undegraded":"2026-03-23T18:34:01.254701+0000","last_fullsized":"2026-03-23T18:34:01.254701+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":4977,"log_dups_size":0,"ondisk_log_size":4977,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T00:28:04.686913+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":11754,"num_read_kb":10177,"num_write":4143,"num_write_kb":6106,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,
2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":559,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254730+0000","last_change":"2026-03-23T17:30:46.635114+0000","last_active":"2026-03-23T18:34:01.254730+0000","last_peered":"2026-03-23T18:34:01.254730+0000","last_clean":"2026-03-23T18:34:01.254730+0000","last_became_active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T18:34:01.254730+0000","last_undegraded":"2026-03-23T18:34:01.254730+0000","last_fullsized":"2026-03-23T18:34:01.254730+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_clean_scrub_stamp":"2026-03-23T17:30:45.468031+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T03:27:15.872490+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":3,"num_pg":4,"stat_sum":{"num_bytes":4294967315,"num_objects":1026,"num_object_clones":0,"num_object_copies":3078,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1026,"num_whiteouts":0,"num_read":2582,"num_read_kb":6792436,"num_write":24298,"num_write_kb":13078263,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_
objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":3221237760,"data_stored":6442463232,"data_compressed":27525120,"data_compressed_allocated":3221225472,"data_compressed_original":6442450944,"omap_allocated":0,"internal_metadata":0},"log_size":24097,"ondisk_log_size":24097,"up":12,"acting":12,"num_store_stats":3},{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":107339,"num_read_kb":90648,"num_write":31280,"num_write_kb":45527,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1627427,"internal_metadata":0},"log_size":33172,"ondisk_log_size":33172,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_wr
ite":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739135,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2485676,"kb_used_data":1051100,"kb_used_omap":395,"kb_used_meta":1434164,"kb_avail":91886164,"statfs":{"total":96636764160,"available":94091431936,"internally_reserved":0,"allocated":1076326400,"data_stored":2148630699,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":405130,"internal_metadata":1468584310},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739136,"num_pgs":17,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2197088,"kb_used_data":1051344,"kb_used_omap":745,"kb_used_meta":1144982,"kb_avail":92174752,"statfs":{"total":96636764160,"available":94386946048,"internally_reserved":0,"al
located":1076576256,"data_stored":2149090085,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":763282,"internal_metadata":1172462190},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739133,"num_pgs":16,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2251244,"kb_used_data":1051356,"kb_used_omap":514,"kb_used_meta":1199357,"kb_avail":92120596,"statfs":{"total":96636764160,"available":94331490304,"internally_reserved":0,"allocated":1076588544,"data_stored":2149090218,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":526897,"internal_metadata":1228142031},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":525307,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"i
nternally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":735134,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":366986,"internal_metadata":0},{"poolid":3,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-23T18:34:22.164 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-23T18:34:22.319 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T18:34:22.319 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":249,"fsid":"bd8fd485-7f5e-4dd1-a4fb-2f27337796a2","created":"2026-03-23T17:30:39.647861+0000","modified":"2026-03-23T18:34:21.094049+0000","last_up_change":"2026-03-23T17:30:43.838380+0000","last_in_change":"2026-03-23T17:30:40.456871+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":5,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":18,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-23T17:30:44.476796+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type"
:"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-23T17:30:48.033244+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_
promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":3,"pool_name":"datapool","create_time":"2026-03-23T17:30:57.752321+0000","flags":8197,"flags_names":"hashpspool,ec_overwrites,selfmanaged_snaps","type":3,"size":3,"min_size":2,"crush_rule":1,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":4,"pg_placement_num":4,"pg_placement_num_target":4,"pg_num_target":4,"pg_num_pending":4,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"245","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":59,"snap_epoch":245,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"teuthologyprofile","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay
_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":8192,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}}}],"osds":[{"osd":0,"uuid":"d91b1ac9-820d-4f5b-83d6-7a015ca26296","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":241,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6804","nonce":2398517092}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6808","nonce":2398517092}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6815","nonce":2398517092}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2398517092},{"type":"v1","addr":"192.168.123.104:6812","nonce":2398517092}]},"public_addr":"192.168.123.104:6804/2398517092","cluster_addr":"192.168.123.104:6808/2398517092","heartbeat_back_addr":"192.168.123.104:6815/2398517092","heartbeat_front_addr":"192.168.123.104:6812/2398517092","state":["exists","up"]},{"osd":1,"uuid":"b630c322-76af-4a70-9b41-7c22a53ab1d6","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":241,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6801","nonce":3728786032}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6803","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6805","nonce":3728786032}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6811","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6813","nonce":3728786032}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6
807","nonce":3728786032},{"type":"v1","addr":"192.168.123.104:6809","nonce":3728786032}]},"public_addr":"192.168.123.104:6801/3728786032","cluster_addr":"192.168.123.104:6805/3728786032","heartbeat_back_addr":"192.168.123.104:6813/3728786032","heartbeat_front_addr":"192.168.123.104:6809/3728786032","state":["exists","up"]},{"osd":2,"uuid":"d8611821-f18a-4353-a7f8-27b1691155ff","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":241,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6817","nonce":2446644018}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6819","nonce":2446644018}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6823","nonce":2446644018}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2446644018},{"type":"v1","addr":"192.168.123.104:6821","nonce":2446644018}]},"public_addr":"192.168.123.104:6817/2446644018","cluster_addr":"192.168.123.104:6819/2446644018","heartbeat_back_addr":"192.168.123.104:6823/2446644018","heartbeat_front_addr":"192.168.123.104:6821/2446644018","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-23T17:30:42.694573+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-23T17:30:42.645775+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-23T17:30:42.706624+0000","dead_epoch":0}],"pg_upma
p":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.104:0/1511835233":"2026-03-23T19:32:23.044997+0000","192.168.123.104:0/2400433533":"2026-03-23T19:25:09.715273+0000","192.168.123.104:0/660690074":"2026-03-23T19:19:04.236109+0000","192.168.123.104:0/1414081755":"2026-03-23T19:19:03.274418+0000","192.168.123.104:0/4142589643":"2026-03-23T19:18:16.262879+0000","192.168.123.104:0/297303125":"2026-03-23T19:30:03.131768+0000","192.168.123.104:0/3945322331":"2026-03-23T19:19:02.637904+0000","192.168.123.104:0/2794231221":"2026-03-23T19:18:17.285537+0000","192.168.123.104:0/2561882436":"2026-03-23T19:18:15.144717+0000","192.168.123.104:0/623257793":"2026-03-23T19:33:56.951425+0000","192.168.123.104:0/3896463438":"2026-03-23T19:19:01.584352+0000","192.168.123.104:0/821815103":"2026-03-23T19:18:14.699964+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"},"teuthologyprofile":{"crush-device-class":"","crush-failure-domain":"osd","crush-num-failure-domains":"0","crush-osds-per-failure-domain":"0","crush-root":"default","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":4,"snaps":[{"begin":2,"length":1}]},{"pool":5,"snaps":[{"begin":2,"length":1}]},{"pool":7,"snaps":[{"begin":2,"length":1}]},{"pool":15,"snaps":[{"begin":2,"length":1}]},{"pool":16,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-23T18:34:23.332 INFO:tasks.ceph:Scrubbing osd.0 2026-03-23T18:34:23.332 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 config set osd_debug_deep_scrub_sleep 
0 2026-03-23T18:34:23.408 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-23T18:34:23.408 INFO:teuthology.orchestra.run.vm04.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-23T18:34:23.408 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-23T18:34:23.418 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 0 2026-03-23T18:34:23.572 INFO:teuthology.orchestra.run.vm04.stderr:instructed osd(s) 0 to deep-scrub 2026-03-23T18:34:23.585 INFO:tasks.ceph:Scrubbing osd.1 2026-03-23T18:34:23.585 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 config set osd_debug_deep_scrub_sleep 0 2026-03-23T18:34:23.658 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-23T18:34:23.658 INFO:teuthology.orchestra.run.vm04.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-23T18:34:23.658 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-23T18:34:23.668 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 1 2026-03-23T18:34:23.822 INFO:teuthology.orchestra.run.vm04.stderr:instructed osd(s) 1 to deep-scrub 2026-03-23T18:34:23.834 INFO:tasks.ceph:Scrubbing osd.2 2026-03-23T18:34:23.834 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 config set osd_debug_deep_scrub_sleep 0 2026-03-23T18:34:23.907 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-23T18:34:23.907 INFO:teuthology.orchestra.run.vm04.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-23T18:34:23.907 INFO:teuthology.orchestra.run.vm04.stdout:} 
2026-03-23T18:34:23.917 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 2 2026-03-23T18:34:24.070 INFO:teuthology.orchestra.run.vm04.stderr:instructed osd(s) 2 to deep-scrub 2026-03-23T18:34:24.082 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-23T18:34:24.234 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T18:34:24.234 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-23T18:34:24.247 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":2155,"stamp":"2026-03-23T18:34:23.036728+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":4295426622,"num_objects":1035,"num_object_clones":0,"num_object_copies":3096,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1035,"num_whiteouts":0,"num_read":109967,"num_read_kb":6883121,"num_write":55635,"num_write_kb":13124374,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":57301,"ondisk_log_size":57
301,"up":30,"acting":30,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":46,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6934008,"kb_used_data":3153800,"kb_used_omap":1655,"kb_used_meta":3778504,"kb_avail":276181512,"statfs":{"total":289910292480,"available":282809868288,"internally_reserved":0,"allocated":3229491200,"data_stored":6446811002,"data_compressed":27529778,"data_compressed_allocated":3221667840,"data_compressed_original":6443335680,"omap_allocated":1695309,"internal_metadata":3869188531},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":847249063,"num_objects":190,"num_object_clones":0,"num_object_copies":582,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":190,"num_whiteouts":0,"num_read":-1822,"num_read_kb":809165,"num_write":-224,"num_write_kb":826774,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size
":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001412"},"pg_stats":[{"pgid":"2.7","version":"240'2546","reported_seq":5883,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254573+0000","last_change":"2026-03-23T17:30:51.694301+0000","last_active":"2026-03-23T18:34:01.254573+0000","last_peered":"2026-03-23T18:34:01.254573+0000","last_clean":"2026-03-23T18:34:01.254573+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T18:34:01.254573+0000","last_undegraded":"2026-03-23T18:34:01.254573+0000","last_fullsized":"2026-03-23T18:34:01.254573+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2546,"log_dups_size":0,"ondisk_log_size":2546,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T18:00:58.180879+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6735,"num_read_kb":5408,"num_write":2151,"num_write_kb":4008,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'3471","reported_seq":7143,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254626+0000","last_change":"2026-03-23T17:30:51.694315+0000","last_active":"2026-03-23T18:34:01.254626+0000","last_peered":"2026-03-23T18:34:01.254626+0000","last_clean":"2026-03-23T18:34:01.254626+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T18:34:01.254626+0000","last_undegraded":"2026-03-23T18:34:01.254626+0000","last_fullsized":"2026-03-23T18:34:01.254626+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp"
:"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3471,"log_dups_size":0,"ondisk_log_size":3471,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T23:06:50.118782+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7588,"num_read_kb":6322,"num_write":2108,"num_write_kb":3875,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'2967","reported_seq":7118,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254767+0000","last_change":"2026-03-23T17:30:51.694436+0000","last_active":"2026-03-23T18:34:01.254767+0000","last_peered":"2026-03-23T18:34:01.254767+0000","last_clean":"2026-03-23T18:34:01.254767+0000","last_became_active":"2026-03-23T17:30:49.50005
2+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T18:34:01.254767+0000","last_undegraded":"2026-03-23T18:34:01.254767+0000","last_fullsized":"2026-03-23T18:34:01.254767+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":2967,"log_dups_size":0,"ondisk_log_size":2967,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T19:02:09.818760+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":9669,"num_read_kb":8128,"num_write":2574,"num_write_kb":4490,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locat
ion_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'5143","reported_seq":11623,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254688+0000","last_change":"2026-03-23T17:30:51.694557+0000","last_active":"2026-03-23T18:34:01.254688+0000","last_peered":"2026-03-23T18:34:01.254688+0000","last_clean":"2026-03-23T18:34:01.254688+0000","last_became_active":"2026-03-23T17:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T18:34:01.254688+0000","last_undegraded":"2026-03-23T18:34:01.254688+0000","last_fullsized":"2026-03-23T18:34:01.254688+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":5143,"log_dups_size":0,"ondisk_log_size":5143,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T20:57:46.539497+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":13592,"num_read_kb":11780,"num_write":3593,"num_write_kb":4723,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"3.3","version":"248'5903","reported_seq":7133,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.515407+0000","last_change":"2026-03-23T18:33:44.838580+0000","last_active":"2026-03-23T18:34:14.515407+0000","last_peered":"2026-03-23T18:34:14.515407+0000","last_clean":"2026-03-23T18:34:14.515407+0000","last_became_active":"2026-03-23T17:30:59.775747+0000","last_became_peered":"2026-03-23T17:30:59.775747+0000","last_unstale":"2026-03-23T18:34:14.515407+0000","last_undegraded":"2026-03-23T18:34:14.515407+0000","last_fullsized":"2026-03-23T18:34:14.515407+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep
_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5903,"log_dups_size":0,"ondisk_log_size":5903,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T01:57:47.976409+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000212728,"stat_sum":{"num_bytes":1065353216,"num_objects":255,"num_object_clones":0,"num_object_copies":765,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":255,"num_whiteouts":0,"num_read":511,"num_read_kb":1613824,"num_write":5939,"num_write_kb":3420754,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.2","version":"240'3916","reported_seq":10714,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.248058+0000","last_change":"2026-03-23T17:30:51.694479+0000","last_active":"2026-03-23T18:34:01.248058+0000","last_peered":"2026-03-23T18:34:01.248058+0000","last_clean":"2026-03-23T18:34:01.248058+0000","last_became_active"
:"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T18:34:01.248058+0000","last_undegraded":"2026-03-23T18:34:01.248058+0000","last_fullsized":"2026-03-23T18:34:01.248058+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3916,"log_dups_size":0,"ondisk_log_size":3916,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:34:59.316767+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":17873,"num_read_kb":15444,"num_write":4416,"num_write_kb":6280,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missin
g":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"3.0","version":"248'6283","reported_seq":7589,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369816+0000","last_change":"2026-03-23T18:33:44.838919+0000","last_active":"2026-03-23T18:34:14.369816+0000","last_peered":"2026-03-23T18:34:14.369816+0000","last_clean":"2026-03-23T18:34:14.369816+0000","last_became_active":"2026-03-23T17:30:59.777274+0000","last_became_peered":"2026-03-23T17:30:59.777274+0000","last_unstale":"2026-03-23T18:34:14.369816+0000","last_undegraded":"2026-03-23T18:34:14.369816+0000","last_fullsized":"2026-03-23T18:34:14.369816+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6283,"log_dups_size":0,"ondisk_log_size":6283,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T23:26:50.313263+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00075581400000000005,"stat_sum":{"num_bytes":981467136,"num_objects":234,"num_object_clones":0,"num_object_copies":702,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":234,"num_whiteouts":0,"num_read":590,"num_read_kb":1814159,"num_write":6318,"num_write_kb":2953208,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.1","version":"240'6570","reported_seq":10591,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.018950+0000","last_change":"2026-03-23T17:30:51.712676+0000","last_active":"2026-03-23T18:34:02.018950+0000","last_peered":"2026-03-23T18:34:02.018950+0000","last_clean":"2026-03-23T18:34:02.018950+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T18:34:02.018950+0000","last_undegraded":"2026-03-23T18:34:02.018950+0000","last_fullsized":"2026-03-23T18:34:02.018950+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_s
crub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":6570,"log_dups_size":0,"ondisk_log_size":6570,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T02:06:03.323628+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":31326,"num_read_kb":26121,"num_write":9933,"num_write_kb":11655,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.1","version":"248'5790","reported_seq":7029,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.438929+0000","last_change":"2026-03-23T18:33:44.838733+0000","last_active":"2026-03-23T18:34:14.438929+0000","last_peered":"2026-03-23T18:34:14.438929+0000","last_clean":"2026-03-23T18:34:14.438929+0000","last_bec
ame_active":"2026-03-23T17:30:59.775767+0000","last_became_peered":"2026-03-23T17:30:59.775767+0000","last_unstale":"2026-03-23T18:34:14.438929+0000","last_undegraded":"2026-03-23T18:34:14.438929+0000","last_fullsized":"2026-03-23T18:34:14.438929+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":5790,"log_dups_size":0,"ondisk_log_size":5790,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:06:35.812683+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026309199999999998,"stat_sum":{"num_bytes":1124073472,"num_objects":268,"num_object_clones":0,"num_object_copies":804,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":268,"num_whiteouts":0,"num_read":552,"num_read_kb":1617042,"num_write":5876,"num_write_kb":3343840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up
":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.0","version":"240'3582","reported_seq":7844,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:02.019013+0000","last_change":"2026-03-23T17:30:51.712764+0000","last_active":"2026-03-23T18:34:02.019013+0000","last_peered":"2026-03-23T18:34:02.019013+0000","last_clean":"2026-03-23T18:34:02.019013+0000","last_became_active":"2026-03-23T17:30:49.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T18:34:02.019013+0000","last_undegraded":"2026-03-23T18:34:02.019013+0000","last_fullsized":"2026-03-23T18:34:02.019013+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":3582,"log_dups_size":0,"ondisk_log_size":3582,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T17:43:01.349684+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8802,"num_read_kb":7268,"num_write":2362,"num_write_kb":4390,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.2","version":"248'6121","reported_seq":7745,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:14.369716+0000","last_change":"2026-03-23T18:33:44.839097+0000","last_active":"2026-03-23T18:34:14.369716+0000","last_peered":"2026-03-23T18:34:14.369716+0000","last_clean":"2026-03-23T18:34:14.369716+0000","last_became_active":"2026-03-23T17:30:59.775868+0000","last_became_peered":"2026-03-23T17:30:59.775868+0000","last_unstale":"2026-03-23T18:34:14.369716+0000","last_undegraded":"2026-03-23T18:34:14.369716+0000","last_fullsized":"2026-03-23T18:34:14.369716+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:58.748337+0000","last_deep_scrub":"0'0","last_deep_s
crub_stamp":"2026-03-23T17:30:58.748337+0000","last_clean_scrub_stamp":"2026-03-23T17:30:58.748337+0000","objects_scrubbed":0,"log_size":6121,"log_dups_size":0,"ondisk_log_size":6121,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:19:50.638326+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00085092399999999997,"stat_sum":{"num_bytes":1124073491,"num_objects":269,"num_object_clones":0,"num_object_copies":807,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":269,"num_whiteouts":0,"num_read":929,"num_read_kb":1747411,"num_write":6165,"num_write_kb":3360461,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"240'4977","reported_seq":11023,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254701+0000","last_change":"2026-03-23T17:30:51.694574+0000","last_active":"2026-03-23T18:34:01.254701+0000","last_peered":"2026-03-23T18:34:01.254701+0000","last_clean":"2026-03-23T18:34:01.254701+0000","last_becam
e_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T18:34:01.254701+0000","last_undegraded":"2026-03-23T18:34:01.254701+0000","last_fullsized":"2026-03-23T18:34:01.254701+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:48.482909+0000","last_clean_scrub_stamp":"2026-03-23T17:30:48.482909+0000","objects_scrubbed":0,"log_size":4977,"log_dups_size":0,"ondisk_log_size":4977,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T00:28:04.686913+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":11754,"num_read_kb":10177,"num_write":4143,"num_write_kb":6106,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,
2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":559,"reported_epoch":248,"state":"active+clean","last_fresh":"2026-03-23T18:34:01.254730+0000","last_change":"2026-03-23T17:30:46.635114+0000","last_active":"2026-03-23T18:34:01.254730+0000","last_peered":"2026-03-23T18:34:01.254730+0000","last_clean":"2026-03-23T18:34:01.254730+0000","last_became_active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T18:34:01.254730+0000","last_undegraded":"2026-03-23T18:34:01.254730+0000","last_fullsized":"2026-03-23T18:34:01.254730+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-23T17:30:45.468031+0000","last_clean_scrub_stamp":"2026-03-23T17:30:45.468031+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T03:27:15.872490+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":3,"num_pg":4,"stat_sum":{"num_bytes":4294967315,"num_objects":1026,"num_object_clones":0,"num_object_copies":3078,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1026,"num_whiteouts":0,"num_read":2582,"num_read_kb":6792436,"num_write":24298,"num_write_kb":13078263,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_
objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":3221237760,"data_stored":6442463232,"data_compressed":27525120,"data_compressed_allocated":3221225472,"data_compressed_original":6442450944,"omap_allocated":0,"internal_metadata":0},"log_size":24097,"ondisk_log_size":24097,"up":12,"acting":12,"num_store_stats":3},{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":107339,"num_read_kb":90648,"num_write":31280,"num_write_kb":45527,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1627427,"internal_metadata":0},"log_size":33172,"ondisk_log_size":33172,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_wr
ite":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739135,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2485676,"kb_used_data":1051100,"kb_used_omap":395,"kb_used_meta":1434164,"kb_avail":91886164,"statfs":{"total":96636764160,"available":94091431936,"internally_reserved":0,"allocated":1076326400,"data_stored":2148630699,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":405130,"internal_metadata":1468584310},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739136,"num_pgs":17,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2197088,"kb_used_data":1051344,"kb_used_omap":745,"kb_used_meta":1144982,"kb_avail":92174752,"statfs":{"total":96636764160,"available":94386946048,"internally_reserved":0,"al
located":1076576256,"data_stored":2149090085,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":763282,"internal_metadata":1172462190},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739133,"num_pgs":16,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2251244,"kb_used_data":1051356,"kb_used_omap":514,"kb_used_meta":1199357,"kb_avail":92120596,"statfs":{"total":96636764160,"available":94331490304,"internally_reserved":0,"allocated":1076588544,"data_stored":2149090218,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":526897,"internal_metadata":1228142031},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":525307,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"i
nternally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":735134,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":366986,"internal_metadata":0},{"poolid":3,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.7 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.6 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.5 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 
time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.4 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 3.3 last_scrub_stamp 2026-03-23T17:30:58.748337+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=58, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.2 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 3.0 last_scrub_stamp 2026-03-23T17:30:58.748337+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=58, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.1 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 
2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 3.1 last_scrub_stamp 2026-03-23T17:30:58.748337+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=58, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.0 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 3.2 last_scrub_stamp 2026-03-23T17:30:58.748337+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=58, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.276 INFO:tasks.ceph:pgid 2.3 last_scrub_stamp 2026-03-23T17:30:48.482909+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=48, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.277 INFO:tasks.ceph:pgid 1.0 last_scrub_stamp 2026-03-23T17:30:45.468031+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=17, tm_min=30, tm_sec=45, tm_wday=0, tm_yday=82, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=23, tm_hour=18, tm_min=34, tm_sec=22, tm_wday=0, tm_yday=82, tm_isdst=0) 2026-03-23T18:34:24.277 INFO:tasks.ceph:Still waiting for all pgs to be scrubbed. 
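The `tasks.ceph` lines above record the scrub-wait check: for each pgid, the task parses `last_scrub_stamp` into a `time.struct_time` and keeps waiting while any stamp is still at or before the time the scrub was requested. A minimal sketch of that comparison (not the teuthology source; the JSON shape and stamp format are taken from the dump above, the function name is illustrative) looks like this:

```python
import json
import time

def unscrubbed_pgs(pg_dump_json: str, check_time: time.struct_time):
    """Return pgids whose last_scrub_stamp is <= check_time, i.e. PGs
    that have not been scrubbed since the check was started."""
    stats = json.loads(pg_dump_json)["pg_map"]["pg_stats"]
    pending = []
    for pg in stats:
        # Stamps look like "2026-03-23T17:30:48.482909+0000"; drop the
        # sub-second/zone suffix before parsing, matching the second-level
        # struct_time values the log prints.
        stamp = pg["last_scrub_stamp"].split(".")[0]
        scrubbed_at = time.strptime(stamp, "%Y-%m-%dT%H:%M:%S")
        if scrubbed_at <= check_time:  # struct_time compares field-by-field
            pending.append(pg["pgid"])
    return pending

# Sample shaped like the dump above: pg 2.7 last scrubbed before the check
# time, pg 2.6 (hypothetical stamp) scrubbed after it.
sample = json.dumps({"pg_map": {"pg_stats": [
    {"pgid": "2.7", "last_scrub_stamp": "2026-03-23T17:30:48.482909+0000"},
    {"pgid": "2.6", "last_scrub_stamp": "2026-03-23T18:40:00.000000+0000"},
]}})
check = time.strptime("2026-03-23T18:34:22", "%Y-%m-%dT%H:%M:%S")
print(unscrubbed_pgs(sample, check))  # → ['2.7']
```

This matches the later dump at 18:34:44, where pgs such as 2.7 and 2.6 show fresh `last_scrub_stamp` values (18:34:29, 18:34:28) and so would drop out of the pending list on the next poll.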
2026-03-23T18:34:44.277 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-23T18:34:44.429 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-23T18:34:44.429 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-23T18:34:44.441 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":2165,"stamp":"2026-03-23T18:34:43.039033+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":4295426622,"num_objects":1035,"num_object_clones":0,"num_object_copies":3096,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1035,"num_whiteouts":0,"num_read":109967,"num_read_kb":6883121,"num_write":55635,"num_write_kb":13124374,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":57310,"ondisk_log_size":57310,"up":30,"acting":30,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":30,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6934008,"kb_used_data":3153800,"kb_used_omap":1659,"kb_used_meta":3778500,"kb_avail":276181512,"statfs
":{"total":289910292480,"available":282809868288,"internally_reserved":0,"allocated":3229491200,"data_stored":6446824901,"data_compressed":27529778,"data_compressed_allocated":3221667840,"data_compressed_original":6443335680,"omap_allocated":1699486,"internal_metadata":3869184354},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"12.001404"},"pg_stats":[{"pgid":"2.7","version":"240'2546","reported_seq":5893,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:29.403141+0000","last_change":"2026-03-23T18:34:29.403141+0000","last_active":"2026-0
3-23T18:34:29.403141+0000","last_peered":"2026-03-23T18:34:29.403141+0000","last_clean":"2026-03-23T18:34:29.403141+0000","last_became_active":"2026-03-23T17:30:49.499940+0000","last_became_peered":"2026-03-23T17:30:49.499940+0000","last_unstale":"2026-03-23T18:34:29.403141+0000","last_undegraded":"2026-03-23T18:34:29.403141+0000","last_fullsized":"2026-03-23T18:34:29.403141+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'2546","last_scrub_stamp":"2026-03-23T18:34:29.403089+0000","last_deep_scrub":"240'2546","last_deep_scrub_stamp":"2026-03-23T18:34:29.403089+0000","last_clean_scrub_stamp":"2026-03-23T18:34:29.403089+0000","objects_scrubbed":0,"log_size":2546,"log_dups_size":0,"ondisk_log_size":2546,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-24T22:19:37.896471+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000678289,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6735,"num_read_kb":5408,"num_write":2151,"num_write_kb":4008,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'3471","reported_seq":7153,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:28.371340+0000","last_change":"2026-03-23T18:34:28.371340+0000","last_active":"2026-03-23T18:34:28.371340+0000","last_peered":"2026-03-23T18:34:28.371340+0000","last_clean":"2026-03-23T18:34:28.371340+0000","last_became_active":"2026-03-23T17:30:49.500491+0000","last_became_peered":"2026-03-23T17:30:49.500491+0000","last_unstale":"2026-03-23T18:34:28.371340+0000","last_undegraded":"2026-03-23T18:34:28.371340+0000","last_fullsized":"2026-03-23T18:34:28.371340+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'3471","last_scrub_stamp":"2026-03-23T18:34:28.371290+0000","last_deep_scrub":"240'3471","last_deep_sc
rub_stamp":"2026-03-23T18:34:28.371290+0000","last_clean_scrub_stamp":"2026-03-23T18:34:28.371290+0000","objects_scrubbed":0,"log_size":3471,"log_dups_size":0,"ondisk_log_size":3471,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T21:41:16.815670+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057337200000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7588,"num_read_kb":6322,"num_write":2108,"num_write_kb":3875,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'2967","reported_seq":7128,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:26.336577+0000","last_change":"2026-03-23T18:34:26.336577+0000","last_active":"2026-03-23T18:34:26.336577+0000","last_peered":"2026-03-23T18:34:26.336577+0000","last_clean":"2026-03-23T18:34:26.336577+0000","last_became_active":"2026-03-23T17:3
0:49.500052+0000","last_became_peered":"2026-03-23T17:30:49.500052+0000","last_unstale":"2026-03-23T18:34:26.336577+0000","last_undegraded":"2026-03-23T18:34:26.336577+0000","last_fullsized":"2026-03-23T18:34:26.336577+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'2967","last_scrub_stamp":"2026-03-23T18:34:26.336530+0000","last_deep_scrub":"240'2967","last_deep_scrub_stamp":"2026-03-23T18:34:26.336530+0000","last_clean_scrub_stamp":"2026-03-23T18:34:26.336530+0000","objects_scrubbed":0,"log_size":2967,"log_dups_size":0,"ondisk_log_size":2967,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T01:25:08.575562+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00060329899999999996,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":9669,"num_read_kb":8128,"num_write":2574,"num_write_kb":4490,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missi
ng":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"249'5145","reported_seq":11635,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:25.291668+0000","last_change":"2026-03-23T18:34:25.291668+0000","last_active":"2026-03-23T18:34:25.291668+0000","last_peered":"2026-03-23T18:34:25.291668+0000","last_clean":"2026-03-23T18:34:25.291668+0000","last_became_active":"2026-03-23T17:30:49.499862+0000","last_became_peered":"2026-03-23T17:30:49.499862+0000","last_unstale":"2026-03-23T18:34:25.291668+0000","last_undegraded":"2026-03-23T18:34:25.291668+0000","last_fullsized":"2026-03-23T18:34:25.291668+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"249'5145","last_scrub_stamp":"2026-03-23T18:34:25.291627+0000","last_deep_scrub":"249'5145","last_deep_scrub_stamp":"2026-03-23T18:34:25.291627+0000","last_clean_scrub_stamp":"2026-03-23T18:34:25.291627+0000","objects_scrubbed":2,"log_size":5145,"log_dups_size":0,"ondisk_log_size":5145,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T02:48:58.887158+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00064037800000000004,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":13592,"num_read_kb":11780,"num_write":3593,"num_write_kb":4723,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"3.3","version":"248'5903","reported_seq":7160,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:27.202941+0000","last_change":"2026-03-23T18:34:27.202941+0000","last_active":"2026-03-23T18:34:27.202941+0000","last_peered":"2026-03-23T18:34:27.202941+0000","last_clean":"2026-03-23T18:34:27.202941+0000","last_became_active":"2026-03-23T17:30:59.775747+0000","last_became_peered":"2026-03-23T17:30:59.775747+0000","last_unstale":"2026-03-23T18:34:27.202941+0000","last_undegraded":"2026-03-23T18:34:27.202941+0000","last_fullsized":"2026-03-23T18:34:27.202941+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"248'5903","last_scrub_stamp":"2026-03-23T18:34:27.202884+0000","last_deep_scrub":"248'5903",
"last_deep_scrub_stamp":"2026-03-23T18:34:27.202884+0000","last_clean_scrub_stamp":"2026-03-23T18:34:27.202884+0000","objects_scrubbed":255,"log_size":5903,"log_dups_size":0,"ondisk_log_size":5903,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":2,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T21:02:01.894329+0000","scrub_duration":1273,"objects_trimmed":0,"snaptrim_duration":0.000212728,"stat_sum":{"num_bytes":1065353216,"num_objects":255,"num_object_clones":0,"num_object_copies":765,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":255,"num_whiteouts":0,"num_read":511,"num_read_kb":1613824,"num_write":5939,"num_write_kb":3420754,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.2","version":"249'3918","reported_seq":10726,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:23.957557+0000","last_change":"2026-03-23T18:34:23.957557+0000","last_active":"2026-03-23T18:34:23.957557+0000","last_peered":"2026-03-23T18:34:23.957557+0000","last_clean":"2026-03-23T18:34:23.957557+0000","last
_became_active":"2026-03-23T17:30:49.499065+0000","last_became_peered":"2026-03-23T17:30:49.499065+0000","last_unstale":"2026-03-23T18:34:23.957557+0000","last_undegraded":"2026-03-23T18:34:23.957557+0000","last_fullsized":"2026-03-23T18:34:23.957557+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"249'3918","last_scrub_stamp":"2026-03-23T18:34:23.957522+0000","last_deep_scrub":"249'3918","last_deep_scrub_stamp":"2026-03-23T18:34:23.957522+0000","last_clean_scrub_stamp":"2026-03-23T18:34:23.957522+0000","objects_scrubbed":2,"log_size":3918,"log_dups_size":0,"ondisk_log_size":3918,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T03:36:44.133784+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.000806078,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":17873,"num_read_kb":15444,"num_write":4416,"num_write_kb":6280,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"actin
g":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"3.0","version":"248'6283","reported_seq":7614,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:31.427102+0000","last_change":"2026-03-23T18:34:31.427102+0000","last_active":"2026-03-23T18:34:31.427102+0000","last_peered":"2026-03-23T18:34:31.427102+0000","last_clean":"2026-03-23T18:34:31.427102+0000","last_became_active":"2026-03-23T17:30:59.777274+0000","last_became_peered":"2026-03-23T17:30:59.777274+0000","last_unstale":"2026-03-23T18:34:31.427102+0000","last_undegraded":"2026-03-23T18:34:31.427102+0000","last_fullsized":"2026-03-23T18:34:31.427102+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"248'6283","last_scrub_stamp":"2026-03-23T18:34:31.427042+0000","last_deep_scrub":"248'6283","last_deep_scrub_stamp":"2026-03-23T18:34:31.427042+0000","last_clean_scrub_stamp":"2026-03-23T18:34:31.427042+0000","objects_scrubbed":234,"log_size":6283,"log_dups_size":0,"ondisk_log_size":6283,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":2,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T00:40:17.214958+0000","scrub_duration":1045,"objects_trimmed":0,"snaptrim_duration":0.00075581400000000005,"stat_sum":{"num_bytes":981467136,"num_objects":234,"num_object_clones":0,"num_object_copies":702,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":234,"num_whiteouts":0,"num_read":590,"num_read_kb":1814159,"num_write":6318,"num_write_kb":2953208,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.1","version":"249'6572","reported_seq":10603,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:25.048929+0000","last_change":"2026-03-23T18:34:25.048929+0000","last_active":"2026-03-23T18:34:25.048929+0000","last_peered":"2026-03-23T18:34:25.048929+0000","last_clean":"2026-03-23T18:34:25.048929+0000","last_became_active":"2026-03-23T17:30:49.499313+0000","last_became_peered":"2026-03-23T17:30:49.499313+0000","last_unstale":"2026-03-23T18:34:25.048929+0000","last_undegraded":"2026-03-23T18:34:25.048929+0000","last_fullsized":"2026-03-23T18:34:25.048929+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"249'6572","last_scrub_stamp":"2026-03-23T18:34:25.048789+0000","las
t_deep_scrub":"249'6572","last_deep_scrub_stamp":"2026-03-23T18:34:25.048789+0000","last_clean_scrub_stamp":"2026-03-23T18:34:25.048789+0000","objects_scrubbed":2,"log_size":6572,"log_dups_size":0,"ondisk_log_size":6572,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T01:50:06.551184+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00044566499999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":31326,"num_read_kb":26121,"num_write":9933,"num_write_kb":11655,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.1","version":"248'5790","reported_seq":7056,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:26.217503+0000","last_change":"2026-03-23T18:34:26.217503+0000","last_active":"2026-03-23T18:34:26.217503+0000","last_peered":"2026-03-23T18:34:26.217503+0000","last_clean":"2026-03-23T18:34:26.217503+00
00","last_became_active":"2026-03-23T17:30:59.775767+0000","last_became_peered":"2026-03-23T17:30:59.775767+0000","last_unstale":"2026-03-23T18:34:26.217503+0000","last_undegraded":"2026-03-23T18:34:26.217503+0000","last_fullsized":"2026-03-23T18:34:26.217503+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"248'5790","last_scrub_stamp":"2026-03-23T18:34:26.217434+0000","last_deep_scrub":"248'5790","last_deep_scrub_stamp":"2026-03-23T18:34:26.217434+0000","last_clean_scrub_stamp":"2026-03-23T18:34:26.217434+0000","objects_scrubbed":268,"log_size":5790,"log_dups_size":0,"ondisk_log_size":5790,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":2,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T03:03:30.655917+0000","scrub_duration":1289,"objects_trimmed":0,"snaptrim_duration":0.00026309199999999998,"stat_sum":{"num_bytes":1124073472,"num_objects":268,"num_object_clones":0,"num_object_copies":804,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":268,"num_whiteouts":0,"num_read":552,"num_read_kb":1617042,"num_write":5876,"num_write_kb":3343840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"
num_objects_repaired":0},"up":[0,2,1],"acting":[0,2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.0","version":"240'3582","reported_seq":7854,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:26.016102+0000","last_change":"2026-03-23T18:34:26.016102+0000","last_active":"2026-03-23T18:34:26.016102+0000","last_peered":"2026-03-23T18:34:26.016102+0000","last_clean":"2026-03-23T18:34:26.016102+0000","last_became_active":"2026-03-23T17:30:49.499316+0000","last_became_peered":"2026-03-23T17:30:49.499316+0000","last_unstale":"2026-03-23T18:34:26.016102+0000","last_undegraded":"2026-03-23T18:34:26.016102+0000","last_fullsized":"2026-03-23T18:34:26.016102+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'3582","last_scrub_stamp":"2026-03-23T18:34:26.016064+0000","last_deep_scrub":"240'3582","last_deep_scrub_stamp":"2026-03-23T18:34:26.016064+0000","last_clean_scrub_stamp":"2026-03-23T18:34:26.016064+0000","objects_scrubbed":0,"log_size":3582,"log_dups_size":0,"ondisk_log_size":3582,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T02:15:14.830039+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040956599999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8802,"num_read_kb":7268,"num_write":2362,"num_write_kb":4390,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"3.2","version":"248'6121","reported_seq":7772,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:32.642736+0000","last_change":"2026-03-23T18:34:32.642736+0000","last_active":"2026-03-23T18:34:32.642736+0000","last_peered":"2026-03-23T18:34:32.642736+0000","last_clean":"2026-03-23T18:34:32.642736+0000","last_became_active":"2026-03-23T17:30:59.775868+0000","last_became_peered":"2026-03-23T17:30:59.775868+0000","last_unstale":"2026-03-23T18:34:32.642736+0000","last_undegraded":"2026-03-23T18:34:32.642736+0000","last_fullsized":"2026-03-23T18:34:32.642736+0000","mapping_epoch":18,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":19,"parent":"0.0","parent_split_bits":0,"last_scrub":"248'6121","last_scrub_stamp":"2026-03-23T18:34:32.642667+0000","last_deep_scrub":"248'6121","l
ast_deep_scrub_stamp":"2026-03-23T18:34:32.642667+0000","last_clean_scrub_stamp":"2026-03-23T18:34:32.642667+0000","objects_scrubbed":269,"log_size":6121,"log_dups_size":0,"ondisk_log_size":6121,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":2,"scrub_schedule":"periodic scrub scheduled @ 2026-03-24T20:40:16.212054+0000","scrub_duration":1233,"objects_trimmed":0,"snaptrim_duration":0.00085092399999999997,"stat_sum":{"num_bytes":1124073491,"num_objects":269,"num_object_clones":0,"num_object_copies":807,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":269,"num_whiteouts":0,"num_read":929,"num_read_kb":1747411,"num_write":6165,"num_write_kb":3360461,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2,0],"acting":[1,2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"249'4978","reported_seq":11034,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:27.388905+0000","last_change":"2026-03-23T18:34:27.388905+0000","last_active":"2026-03-23T18:34:27.388905+0000","last_peered":"2026-03-23T18:34:27.388905+0000","last_clean":"2026-03-23T18:34:27.388905+00
00","last_became_active":"2026-03-23T17:30:49.498440+0000","last_became_peered":"2026-03-23T17:30:49.498440+0000","last_unstale":"2026-03-23T18:34:27.388905+0000","last_undegraded":"2026-03-23T18:34:27.388905+0000","last_fullsized":"2026-03-23T18:34:27.388905+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"249'4978","last_scrub_stamp":"2026-03-23T18:34:27.388871+0000","last_deep_scrub":"249'4978","last_deep_scrub_stamp":"2026-03-23T18:34:27.388871+0000","last_clean_scrub_stamp":"2026-03-23T18:34:27.388871+0000","objects_scrubbed":1,"log_size":4978,"log_dups_size":0,"ondisk_log_size":4978,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T03:09:10.679087+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00056242200000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":11754,"num_read_kb":10177,"num_write":4143,"num_write_kb":6106,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0
},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"249'34","reported_seq":571,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-23T18:34:24.323068+0000","last_change":"2026-03-23T18:34:24.323068+0000","last_active":"2026-03-23T18:34:24.323068+0000","last_peered":"2026-03-23T18:34:24.323068+0000","last_clean":"2026-03-23T18:34:24.323068+0000","last_became_active":"2026-03-23T17:30:46.634518+0000","last_became_peered":"2026-03-23T17:30:46.634518+0000","last_unstale":"2026-03-23T18:34:24.323068+0000","last_undegraded":"2026-03-23T18:34:24.323068+0000","last_fullsized":"2026-03-23T18:34:24.323068+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"249'34","last_scrub_stamp":"2026-03-23T18:34:24.323032+0000","last_deep_scrub":"249'34","last_deep_scrub_stamp":"2026-03-23T18:34:24.323032+0000","last_clean_scrub_stamp":"2026-03-23T18:34:24.323032+0000","objects_scrubbed":2,"log_size":34,"log_dups_size":0,"ondisk_log_size":34,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T03:56:03.205059+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":3,"num_pg":4,"stat_sum":{"num_bytes":4294967315,"num_objects":1026,"num_object_clones":0,"num_object_copies":3078,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1026,"num_whiteouts":0,"num_read":2582,"num_read_kb":6792436,"num_write":24298,"num_write_kb":13078263,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_
objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":3221237760,"data_stored":6442463232,"data_compressed":27525120,"data_compressed_allocated":3221225472,"data_compressed_original":6442450944,"omap_allocated":0,"internal_metadata":0},"log_size":24097,"ondisk_log_size":24097,"up":12,"acting":12,"num_store_stats":3},{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":107339,"num_read_kb":90648,"num_write":31280,"num_write_kb":45527,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1631498,"internal_metadata":0},"log_size":33179,"ondisk_log_size":33179,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_wr
ite":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":4658,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":34,"ondisk_log_size":34,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739139,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2485680,"kb_used_data":1051104,"kb_used_omap":397,"kb_used_meta":1434162,"kb_avail":91886160,"statfs":{"total":96636764160,"available":94091427840,"internally_reserved":0,"allocated":1076330496,"data_stored":2148635429,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":406832,"internal_metadata":1468582608},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739140,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2197092,"kb_used_data":1051348,"kb_used_omap":746,"kb_used_meta":1144981,"kb_avail":92174748,"statfs":{"total":96636764160,"available":94386941952,"internally_reserved":0,"all
ocated":1076580352,"data_stored":2149094736,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":764706,"internal_metadata":1172460766},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739137,"num_pgs":10,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2251236,"kb_used_data":1051348,"kb_used_omap":515,"kb_used_meta":1199356,"kb_avail":92120604,"statfs":{"total":96636764160,"available":94331498496,"internally_reserved":0,"allocated":1076580352,"data_stored":2149094736,"data_compressed":9177369,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":527948,"internal_metadata":1228140980},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2329,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":526358,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"in
ternally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":736558,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":368582,"internal_metadata":0},{"poolid":3,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0},{"poolid":3,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":1073745920,"data_stored":2147487744,"data_compressed":9175040,"data_compressed_allocated":1073741824,"data_compressed_original":2147483648,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-23T18:34:44.442 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph config set global mon_health_to_clog false 2026-03-23T18:34:44.614 INFO:teuthology.misc:Shutting down mds daemons... 2026-03-23T18:34:44.614 INFO:teuthology.misc:Shutting down osd daemons... 
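The pg dump JSON above carries per-pool `store_stats` with compression counters (`data_compressed_original` vs `data_compressed_allocated`), which is how you can verify the `bluestore-comp-zstd` fragment actually compressed the EC data pool. A minimal sketch of extracting that ratio from a captured dump; the top-level JSON wrapping varies across Ceph releases, so accepting either a bare map or a `pg_map` wrapper is an assumption:

```python
import json

def compression_summary(path):
    """Summarize per-pool compression from a captured `ceph pg dump --format json`.

    Assumption: the dump is either a bare map (as in the log excerpt above) or
    nested under a top-level "pg_map" key, depending on the Ceph release.
    Returns {poolid: allocated/original} for pools with compressed data.
    """
    with open(path) as f:
        dump = json.load(f)
    dump = dump.get("pg_map", dump)
    out = {}
    for pool in dump["pool_stats"]:
        ss = pool["store_stats"]
        orig = ss["data_compressed_original"]
        if orig:
            out[pool["poolid"]] = ss["data_compressed_allocated"] / orig
    return out
```

In the dump above, pool 3 (the `datapool` EC pool) shows 6442450944 original bytes compressed down to 3221225472 allocated, i.e. zstd halved the on-disk footprint of the workload's compressible data.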
2026-03-23T18:34:44.615 DEBUG:tasks.ceph.osd.0:waiting for process to exit 2026-03-23T18:34:44.615 INFO:teuthology.orchestra.run:waiting for 300 2026-03-23T18:34:47.229 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.osd.0 is failed for ~0s 2026-03-23T18:34:50.716 INFO:tasks.ceph.osd.0:Stopped 2026-03-23T18:34:50.716 DEBUG:tasks.ceph.osd.1:waiting for process to exit 2026-03-23T18:34:50.716 INFO:teuthology.orchestra.run:waiting for 300 2026-03-23T18:34:52.531 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.osd.1 is failed for ~0s 2026-03-23T18:34:52.531 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.osd.0 has been restored 2026-03-23T18:34:56.817 INFO:tasks.ceph.osd.1:Stopped 2026-03-23T18:34:56.817 DEBUG:tasks.ceph.osd.2:waiting for process to exit 2026-03-23T18:34:56.817 INFO:teuthology.orchestra.run:waiting for 300 2026-03-23T18:34:56.891 INFO:tasks.ceph.osd.2:Stopped 2026-03-23T18:34:56.891 INFO:teuthology.misc:Shutting down mgr daemons... 2026-03-23T18:34:56.891 DEBUG:tasks.ceph.mgr.x:waiting for process to exit 2026-03-23T18:34:56.891 INFO:teuthology.orchestra.run:waiting for 300 2026-03-23T18:34:56.921 INFO:tasks.ceph.mgr.x:Stopped 2026-03-23T18:34:56.921 INFO:teuthology.misc:Shutting down mon daemons... 2026-03-23T18:34:56.921 DEBUG:tasks.ceph.mon.a:waiting for process to exit 2026-03-23T18:34:56.921 INFO:teuthology.orchestra.run:waiting for 300 2026-03-23T18:34:56.973 INFO:tasks.ceph.mon.a:Stopped 2026-03-23T18:34:56.974 INFO:tasks.ceph:Checking cluster log for badness... 
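The shutdown sequence above repeats a "waiting for 300" pattern: each daemon is asked to stop, then the harness waits up to a timeout for the process to exit before logging `Stopped` and moving to the next one (the brief watchdog "is failed"/"has been restored" pairs are just the watchdog observing the intentional stop). A simplified sketch of that wait loop, not teuthology's actual implementation:

```python
import subprocess
import time

def wait_for_exit(proc, timeout=300, poll=1.0):
    """Wait up to `timeout` seconds for a subprocess to exit.

    Sketch of the "waiting for 300" pattern in the log above: poll the
    process until it terminates or the deadline passes.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if proc.poll() is not None:
            return proc.returncode
        time.sleep(poll)
    raise TimeoutError(f"process did not exit within {timeout}s")
```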
2026-03-23T18:34:56.974 DEBUG:teuthology.orchestra.run.vm04:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v '\(OSD_SLOW_PING_TIME' | head -n 1 2026-03-23T18:34:57.028 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-0 on ubuntu@vm04.local 2026-03-23T18:34:57.028 DEBUG:teuthology.orchestra.run.vm04:> sync && sudo umount -f /var/lib/ceph/osd/ceph-0 2026-03-23T18:34:57.136 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-1 on ubuntu@vm04.local 2026-03-23T18:34:57.136 DEBUG:teuthology.orchestra.run.vm04:> sync && sudo umount -f /var/lib/ceph/osd/ceph-1 2026-03-23T18:34:57.182 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-2 on ubuntu@vm04.local 2026-03-23T18:34:57.182 DEBUG:teuthology.orchestra.run.vm04:> sync && sudo umount -f /var/lib/ceph/osd/ceph-2 2026-03-23T18:34:57.285 INFO:tasks.ceph:Archiving mon data... 2026-03-23T18:34:57.285 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/lib/ceph/mon/ceph-a to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501/data/mon.a.tgz 2026-03-23T18:34:57.286 DEBUG:teuthology.orchestra.run.vm04:> mktemp 2026-03-23T18:34:57.289 INFO:teuthology.orchestra.run.vm04.stdout:/tmp/tmp.2jUUnhnznR 2026-03-23T18:34:57.289 DEBUG:teuthology.orchestra.run.vm04:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-a -- . > /tmp/tmp.2jUUnhnznR 2026-03-23T18:34:57.398 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0666 /tmp/tmp.2jUUnhnznR 2026-03-23T18:34:57.453 DEBUG:teuthology.orchestra.remote:vm04:/tmp/tmp.2jUUnhnznR is 472KB 2026-03-23T18:34:57.503 DEBUG:teuthology.orchestra.run.vm04:> rm -fr /tmp/tmp.2jUUnhnznR 2026-03-23T18:34:57.506 INFO:tasks.ceph:Cleaning ceph cluster... 
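The "Checking cluster log for badness" step above is a grep pipeline: flag the first `[ERR]`/`[WRN]`/`[SEC]` line in `ceph.log` that is not excused by the job's `log-ignorelist` (here `\(MDS_ALL_DOWN\)`, `\(MDS_UP_LESS_THAN_MAX\)`, and `\(OSD_SLOW_PING_TIME`). A minimal Python equivalent of that filter, under the assumption that the ignorelist entries are plain regexes as shown in the job config:

```python
import re

# Ignorelist patterns taken from this job's config (see the egrep pipeline above).
IGNORELIST = [r"\(MDS_ALL_DOWN\)", r"\(MDS_UP_LESS_THAN_MAX\)", r"\(OSD_SLOW_PING_TIME"]

def first_bad_line(lines, ignorelist=IGNORELIST):
    """Return the first ERR/WRN/SEC cluster-log line not matched by the
    ignorelist, or None if the log is clean (mirrors `egrep ... | egrep -v ... | head -n 1`)."""
    bad = re.compile(r"\[ERR\]|\[WRN\]|\[SEC\]")
    ignore = [re.compile(p) for p in ignorelist]
    for line in lines:
        if bad.search(line) and not any(p.search(line) for p in ignore):
            return line
    return None
```

An empty result from this check (as in this run, where the pipeline printed nothing) means the job's health warnings were all on the ignorelist.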
2026-03-23T18:34:57.506 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /etc/ceph/ceph.conf /etc/ceph/ceph.keyring /home/ubuntu/cephtest/ceph.data /home/ubuntu/cephtest/ceph.monmap /home/ubuntu/cephtest/../*.pid 2026-03-23T18:34:57.531 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.osd.1 has been restored 2026-03-23T18:34:57.597 INFO:teuthology.util.scanner:summary_data or yaml_file is empty! 2026-03-23T18:34:57.597 INFO:tasks.ceph:Archiving crash dumps... 2026-03-23T18:34:57.597 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/lib/ceph/crash to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501/remote/vm04/crash 2026-03-23T18:34:57.598 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/lib/ceph/crash -- . 2026-03-23T18:34:57.646 INFO:tasks.ceph:Compressing logs... 2026-03-23T18:34:57.646 DEBUG:teuthology.orchestra.run.vm04:> time sudo find /var/log/ceph -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-23T18:34:57.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41765.log 2026-03-23T18:34:57.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53503.log 2026-03-23T18:34:57.700 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41765.log.gz 2026-03-23T18:34:57.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82223.log 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40257.log 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53503.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53503.log.gz 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.80726.log 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82223.log: /var/log/ceph/ceph-client.admin.40257.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83493.log 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40257.log.gz 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64931.log 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr: 56.9% -- replaced with /var/log/ceph/ceph-client.admin.82223.log.gz 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80726.log.gz 2026-03-23T18:34:57.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38081.log 2026-03-23T18:34:57.702 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83493.log.gz 2026-03-23T18:34:57.702 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64931.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64931.log.gz 2026-03-23T18:34:57.702 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48297.log 2026-03-23T18:34:57.702 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60972.log 2026-03-23T18:34:57.702 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38081.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85676.log 2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48297.log.gz 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38081.log.gz 
2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60972.log: 2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60972.log.gz 2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68651.log 2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85676.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45452.log 2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85676.log.gz 2026-03-23T18:34:57.703 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45798.log 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68651.log.gz 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45452.log.gz 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69456.log 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59222.log 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45798.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45798.log.gz 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25550.log 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69456.log.gz 2026-03-23T18:34:57.704 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59222.log.gz 2026-03-23T18:34:57.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37766.log 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55623.log 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25550.log.gz 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74380.log 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37766.log: /var/log/ceph/ceph-client.admin.55623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55623.log.gz 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35413.log 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37766.log.gz 2026-03-23T18:34:57.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63785.log 2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74380.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74380.log.gz 2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31023.log 2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35413.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78000.log 2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63785.log: 0.0% 26.5% -- replaced with /var/log/ceph/ceph-client.admin.35413.log.gz 
2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.63785.log.gz 2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31023.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31023.log.gz 2026-03-23T18:34:57.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45194.log 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78000.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39276.log 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78000.log.gz 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30796.log 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45194.log.gz 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39276.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64067.log 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.39276.log.gz 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30796.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75830.log 2026-03-23T18:34:57.707 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30796.log.gz 2026-03-23T18:34:57.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67899.log 2026-03-23T18:34:57.708 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64067.log.gz 2026-03-23T18:34:57.708 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75830.log.gz 2026-03-23T18:34:57.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46581.log 2026-03-23T18:34:57.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52012.log 2026-03-23T18:34:57.708 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67899.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67899.log.gz 2026-03-23T18:34:57.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33028.log 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46581.log.gz 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.52012.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45970.log 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.52012.log.gz 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33028.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75325.log 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33028.log.gz 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64471.log 2026-03-23T18:34:57.709 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45970.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45970.log.gz/var/log/ceph/ceph-client.admin.75325.log: 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75325.log.gz 
2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30493.log 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41919.log 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64471.log.gz 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33859.log 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30493.log.gz 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41919.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41919.log.gz 2026-03-23T18:34:57.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63648.log 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37280.log 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33859.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32201.log 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63648.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.33859.log.gz 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63648.log.gz/var/log/ceph/ceph-client.admin.37280.log: 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78151.log 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37280.log.gz 2026-03-23T18:34:57.711 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32201.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83436.log 2026-03-23T18:34:57.711 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32201.log.gz 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78151.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72528.log 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78151.log.gz 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58752.log 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83436.log.gz 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72528.log.gz 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66149.log 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75361.log 2026-03-23T18:34:57.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58752.log.gz 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40107.log 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66149.log.gz 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75361.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75361.log.gz 
2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45172.log 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51476.log 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40107.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40107.log.gz 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79768.log 2026-03-23T18:34:57.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45172.log.gz 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51476.log.gz 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57265.log 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76024.log 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79768.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79768.log.gz 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65847.log 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57265.log.gz 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76024.log.gz 2026-03-23T18:34:57.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32575.log 2026-03-23T18:34:57.715 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77567.log 2026-03-23T18:34:57.715 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65847.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65847.log.gz 2026-03-23T18:34:57.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69017.log 2026-03-23T18:34:57.715 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32575.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32575.log.gz 2026-03-23T18:34:57.715 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77567.log.gz 2026-03-23T18:34:57.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50009.log 2026-03-23T18:34:57.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44268.log 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69017.log.gz 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60355.log 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50009.log.gz 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44268.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76389.log 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.44268.log.gz 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79662.log 2026-03-23T18:34:57.716 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60355.log.gz 2026-03-23T18:34:57.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28992.log 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76389.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76389.log.gz 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79662.log.gz 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27684.log 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54191.log 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28992.log.gz 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84444.log 2026-03-23T18:34:57.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27684.log.gz 2026-03-23T18:34:57.718 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54191.log.gz 2026-03-23T18:34:57.718 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34699.log 2026-03-23T18:34:57.718 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34510.log 2026-03-23T18:34:57.718 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84444.log.gz 
2026-03-23T18:34:57.718 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34699.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67708.log 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.34699.log.gz 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34510.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71286.log 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34510.log.gz 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67708.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62659.log 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67708.log.gz 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71286.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68608.log 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71286.log.gz 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62659.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77592.log 2026-03-23T18:34:57.719 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62659.log.gz 2026-03-23T18:34:57.720 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68608.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77080.log 2026-03-23T18:34:57.720 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68608.log.gz 2026-03-23T18:34:57.720 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45409.log 2026-03-23T18:34:57.720 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77592.log.gz 2026-03-23T18:34:57.720 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77080.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.77080.log.gz -5 2026-03-23T18:34:57.720 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.75637.log 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45409.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89766.log 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45409.log.gz 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37577.log 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75637.log.gz 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89766.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph.audit.log 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.89766.log.gz 2026-03-23T18:34:57.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37577.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77101.log 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.37577.log.gz 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph.audit.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54583.log 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77101.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36090.log 
2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77101.log.gz 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54583.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82914.log 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54583.log.gz 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37263.log 2026-03-23T18:34:57.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36090.log.gz 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82914.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82914.log.gz 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38228.log 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37263.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69210.log 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37263.log.gz 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62111.log 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr: 89.9% -- replaced with /var/log/ceph/ceph.audit.log.gz 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38228.log: /var/log/ceph/ceph-client.admin.69210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69210.log.gz 2026-03-23T18:34:57.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68565.log 2026-03-23T18:34:57.723 
INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38228.log.gz 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29290.log 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62111.log.gz 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68565.log.gz 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41017.log 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44979.log 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29290.log.gz 2026-03-23T18:34:57.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60725.log 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41017.log: /var/log/ceph/ceph-client.admin.44979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44979.log.gz 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83474.log 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.41017.log.gz 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81696.log 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60725.log.gz 2026-03-23T18:34:57.725 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87208.log 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83474.log: /var/log/ceph/ceph-client.admin.81696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83474.log.gz 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58483.log 2026-03-23T18:34:57.725 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81696.log.gz 2026-03-23T18:34:57.726 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87208.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46509.log 2026-03-23T18:34:57.726 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87208.log.gz 2026-03-23T18:34:57.726 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75766.log 2026-03-23T18:34:57.726 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58483.log.gz 2026-03-23T18:34:57.726 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46509.log.gz 2026-03-23T18:34:57.726 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79463.log 2026-03-23T18:34:57.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75766.log.gz 2026-03-23T18:34:57.727 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27304.log 2026-03-23T18:34:57.727 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91264.log 2026-03-23T18:34:57.727 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79463.log.gz 2026-03-23T18:34:57.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27304.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.32524.log 2026-03-23T18:34:57.727 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.27304.log.gz 2026-03-23T18:34:57.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91264.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83729.log 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91264.log.gz 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47957.log 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32524.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32524.log.gz 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83729.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.83729.log.gz -5 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.68716.log 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47957.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76779.log 2026-03-23T18:34:57.728 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47957.log.gz 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58590.log 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68716.log.gz 
2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76779.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78022.log 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76779.log.gz 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58590.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70775.log 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58590.log.gz 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49299.log 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78022.log.gz 2026-03-23T18:34:57.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70775.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70775.log.gz 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63417.log 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49299.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63842.log 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49299.log.gz 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39297.log 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63417.log.gz 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63842.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.63126.log 2026-03-23T18:34:57.730 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63842.log.gz 2026-03-23T18:34:57.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39297.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54665.log 2026-03-23T18:34:57.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63126.log: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.39297.log.gz 2026-03-23T18:34:57.731 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63126.log.gz 2026-03-23T18:34:57.731 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27436.log 2026-03-23T18:34:57.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54665.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74001.log 2026-03-23T18:34:57.731 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54665.log.gz 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27436.log.gz 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86474.log 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85999.log 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74001.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74001.log.gz 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64909.log 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86474.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.86474.log.gz 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85999.log.gz 2026-03-23T18:34:57.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49064.log 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42099.log 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64909.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.64909.log.gz 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39549.log 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49064.log.gz/var/log/ceph/ceph-client.admin.42099.log: 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84245.log 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr: 54.9% -- replaced with /var/log/ceph/ceph-client.admin.42099.log.gz 2026-03-23T18:34:57.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39549.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70646.log 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.39549.log.gz 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84245.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74211.log 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84245.log.gz 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70646.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.67113.log 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70646.log.gz 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75498.log 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74211.log.gz 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67113.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61230.log 2026-03-23T18:34:57.734 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67113.log.gz 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60295.log 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75498.log.gz 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70194.log 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61230.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61230.log.gz 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60295.log.gz 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43890.log 2026-03-23T18:34:57.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81904.log 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70194.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.70194.log.gz 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69060.log 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43890.log: /var/log/ceph/ceph-client.admin.81904.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.81904.log.gz -5 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.66382.log 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.43890.log.gz 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30450.log 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69060.log.gz 2026-03-23T18:34:57.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43216.log 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66382.log.gz 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30450.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.30450.log.gz -- 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.81309.log 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43216.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90632.log 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43216.log.gz 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.31744.log 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81309.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81309.log.gz 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90632.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42322.log 2026-03-23T18:34:57.737 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90632.log.gz 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78708.log 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31744.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31744.log.gz 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47807.log 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42322.log: /var/log/ceph/ceph-client.admin.78708.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78708.log.gz 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.42322.log.gz 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61832.log 2026-03-23T18:34:57.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49557.log 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47807.log: /var/log/ceph/ceph-client.admin.61832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47807.log.gz 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61832.log.gz 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.72735.log 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27740.log 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49557.log.gz 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54444.log 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72735.log.gz 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27740.log.gz 2026-03-23T18:34:57.739 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85913.log 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54444.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76908.log 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54444.log.gz 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85913.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37850.log 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85913.log.gz 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76908.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65141.log 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76908.log.gz 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37850.log: 
gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82269.log 2026-03-23T18:34:57.740 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80855.log 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65141.log: 26.4% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65141.log.gz 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.37850.log.gz 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82269.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59179.log 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82269.log.gz 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88885.log 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80855.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80855.log.gz 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27261.log 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59179.log.gz 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88885.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37127.log 2026-03-23T18:34:57.741 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88885.log.gz 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84401.log 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27261.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.27261.log.gz 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61316.log 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37127.log.gz/var/log/ceph/ceph-client.admin.84401.log: 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr:gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.84401.log.gz -- 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.43504.log 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61316.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67542.log 2026-03-23T18:34:57.742 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61316.log.gz 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26720.log 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43504.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85483.log 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67542.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.43504.log.gz 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67542.log.gz 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26720.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29312.log 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26720.log.gz 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85483.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.34552.log 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85483.log.gz 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69949.log 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29312.log.gz 2026-03-23T18:34:57.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30833.log 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34552.log: /var/log/ceph/ceph-client.admin.69949.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28736.log 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.34552.log.gz 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.69949.log.gz 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30833.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30833.log.gz 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74630.log 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28736.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67643.log 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28736.log.gz 2026-03-23T18:34:57.744 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90304.log 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74630.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.74630.log.gz 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67643.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44513.log 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67643.log.gz 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90304.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62326.log 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90304.log.gz 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56699.log 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44513.log.gz 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37229.log 2026-03-23T18:34:57.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62326.log.gz 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56699.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49686.log 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56699.log.gz 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67749.log 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37229.log.gz 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.50289.log 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49686.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49686.log.gz 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67749.log.gz 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38757.log 2026-03-23T18:34:57.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50289.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79575.log 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50289.log.gz 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50181.log 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38757.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37997.log 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79575.log: 0.0% 24.5% -- replaced with /var/log/ceph/ceph-client.admin.38757.log.gz 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.79575.log.gz 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50181.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.50181.log.gz -- 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.62240.log 2026-03-23T18:34:57.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37997.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33232.log 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.57288.log 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.37997.log.gz 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62240.log.gz 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33232.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66362.log 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33232.log.gz 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37619.log 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57288.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57288.log.gz 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66362.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47058.log 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66362.log.gz 2026-03-23T18:34:57.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37619.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59459.log 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47058.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39339.log 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47058.log.gz 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.37619.log.gz 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr:gzip 
-5/var/log/ceph/ceph-client.admin.59459.log: --verbose -- /var/log/ceph/ceph-client.admin.38165.log 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59459.log.gz 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42946.log 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39339.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88175.log 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38165.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.39339.log.gz 2026-03-23T18:34:57.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42946.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29333.log 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42946.log.gz 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38165.log.gz 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48840.log 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88175.log.gz 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40236.log 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29333.log.gz 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86149.log 2026-03-23T18:34:57.750 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48840.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48840.log.gz 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40236.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83966.log 2026-03-23T18:34:57.750 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40236.log.gz 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86149.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83901.log 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86149.log.gz 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51090.log 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83966.log.gz 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38379.log 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83901.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83901.log.gz 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51090.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68780.log 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51090.log.gz 2026-03-23T18:34:57.751 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35035.log 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84508.log 
2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38379.log.gz 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68780.log.gz 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89981.log 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35035.log: /var/log/ceph/ceph-client.admin.84508.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33922.log 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84508.log.gz 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.35035.log.gz 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73280.log 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89981.log.gz 2026-03-23T18:34:57.752 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84224.log 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33922.log: /var/log/ceph/ceph-client.admin.73280.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.66984.log -- replaced with /var/log/ceph/ceph-client.admin.73280.log.gz 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.33922.log.gz 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84224.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.73930.log 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84224.log.gz 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66984.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86904.log 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66984.log.gz 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73930.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67293.log 2026-03-23T18:34:57.753 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73930.log.gz 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86904.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29762.log 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86904.log.gz 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose/var/log/ceph/ceph-client.admin.67293.log: -- /var/log/ceph/ceph-client.admin.89465.log 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67293.log.gz 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29762.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51282.log 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29762.log.gz 2026-03-23T18:34:57.754 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89465.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89465.log.gz 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.80074.log 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51282.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39219.log 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51282.log.gz 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31207.log 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80074.log.gz 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39219.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28263.log 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31207.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32541.log 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr: 52.6% -- replaced with /var/log/ceph/ceph-client.admin.39219.log.gz 2026-03-23T18:34:57.755 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31207.log.gz 2026-03-23T18:34:57.756 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44872.log 2026-03-23T18:34:57.756 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28263.log.gz 2026-03-23T18:34:57.756 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81546.log 2026-03-23T18:34:57.756 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32541.log.gz 2026-03-23T18:34:57.756 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44872.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.72555.log 2026-03-23T18:34:57.756 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44872.log.gz 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81546.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65908.log 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81546.log.gz 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72555.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29913.log 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72555.log.gz 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65908.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72115.log 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65908.log.gz 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57993.log 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29913.log.gz 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72115.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69543.log 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72115.log.gz 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57993.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75228.log 2026-03-23T18:34:57.757 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.57993.log.gz 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89508.log 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69543.log.gz 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32081.log 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75228.log.gz 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89508.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67358.log 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89508.log.gz 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35455.log 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32081.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32081.log.gz 2026-03-23T18:34:57.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88454.log 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67358.log.gz 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65201.log 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35455.log: /var/log/ceph/ceph-client.admin.88454.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81524.log 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.88454.log.gz 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.35455.log.gz 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65201.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65201.log.gz 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35750.log 2026-03-23T18:34:57.759 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81524.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67155.log 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81524.log.gz 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35750.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38715.log 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35750.log.gz 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67155.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48676.log 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67155.log.gz 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58900.log 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38715.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30171.log 2026-03-23T18:34:57.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48676.log: 27.5% -- replaced with /var/log/ceph/ceph-client.admin.38715.log.gz 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.48676.log.gz/var/log/ceph/ceph-client.admin.58900.log: 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.58900.log.gz -5 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.36719.log 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30171.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54464.log 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30171.log.gz 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34342.log 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36719.log.gz 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71608.log 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54464.log.gz 2026-03-23T18:34:57.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34342.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61961.log 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71608.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.34342.log.gz 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71608.log.gz 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70710.log 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61961.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.67813.log 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61961.log.gz 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70710.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60434.log 2026-03-23T18:34:57.762 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70710.log.gz 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67813.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67813.log.gz 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79853.log 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60434.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46185.log 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60434.log.gz 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79853.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39009.log 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79853.log.gz 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46185.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65968.log 2026-03-23T18:34:57.763 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46185.log.gz 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39009.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45755.log 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.39009.log.gz 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47442.log 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65968.log.gz 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose/var/log/ceph/ceph-client.admin.45755.log: -- /var/log/ceph/ceph-client.admin.85633.log 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45755.log.gz 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47442.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37514.log 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47442.log.gz 2026-03-23T18:34:57.764 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35287.log 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85633.log.gz 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88750.log 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37514.log: /var/log/ceph/ceph-client.admin.35287.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55601.log 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.37514.log.gz 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88750.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35287.log.gz 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr: 0.0%gzip 
-5 --verbose -- /var/log/ceph/ceph-client.admin.27796.log 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.88750.log.gz 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55601.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41636.log 2026-03-23T18:34:57.765 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.55601.log.gz 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27796.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82806.log 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27796.log.gz 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47420.log 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41636.log.gz 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82806.log.gz 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60622.log 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47420.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76478.log 2026-03-23T18:34:57.766 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47420.log.gz 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28048.log 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60622.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60622.log.gz 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76478.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76478.log.gz 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33733.log 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67379.log 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28048.log.gz 2026-03-23T18:34:57.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41657.log 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33733.log: /var/log/ceph/ceph-client.admin.67379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51196.log 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67379.log.gz 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.33733.log.gz 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41657.log.gz 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70452.log 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72789.log 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51196.log.gz 2026-03-23T18:34:57.768 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.32371.log 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70452.log.gz/var/log/ceph/ceph-client.admin.72789.log: 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85131.log 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72789.log.gz 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32371.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70259.log 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32371.log.gz 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46121.log 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85131.log.gz 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79309.log 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70259.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70259.log.gz 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46121.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90488.log 2026-03-23T18:34:57.769 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46121.log.gz 2026-03-23T18:34:57.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26219.log 2026-03-23T18:34:57.770 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79309.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.79309.log.gz 2026-03-23T18:34:57.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26634.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90488.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81653.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26219.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26634.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70388.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26634.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81653.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51798.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81653.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64612.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70388.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70388.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58570.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51798.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51798.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64612.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.63239.log 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64612.log.gz 2026-03-23T18:34:57.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58570.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88068.log 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58570.log.gz 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25504.log 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63239.log.gz 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47485.log 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88068.log.gz 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25504.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61617.log 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25504.log.gz 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose/var/log/ceph/ceph-client.admin.47485.log: -- /var/log/ceph/ceph-client.admin.81288.log 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47485.log.gz 2026-03-23T18:34:57.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27588.log 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61617.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.61617.log.gz 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81288.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48655.log 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81288.log.gz 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27588.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37556.log 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27588.log.gz 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57929.log 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48655.log.gz 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63984.log 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37556.log: /var/log/ceph/ceph-client.admin.57929.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90196.log 2026-03-23T18:34:57.773 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57929.log.gz 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.37556.log.gz 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63984.log.gz 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53804.log 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90196.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.38631.log 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90196.log.gz 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53804.log.gz 2026-03-23T18:34:57.774 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87132.log 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38631.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86345.log 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26741.log 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87132.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38631.log.gz 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87132.log.gz/var/log/ceph/ceph-client.admin.86345.log: 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67091.log 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86345.log.gz 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62621.log 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26741.log.gz 2026-03-23T18:34:57.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67091.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76886.log 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.67091.log.gz 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62621.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78760.log 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62621.log.gz 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44814.log 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76886.log.gz 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78760.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59072.log 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78760.log.gz 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28937.log 2026-03-23T18:34:57.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44814.log.gz 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87832.log 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59072.log.gz 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28937.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58503.log 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28937.log.gz 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85419.log 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87832.log.gz 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71565.log 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58503.log: /var/log/ceph/ceph-client.admin.85419.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54747.log 2026-03-23T18:34:57.777 INFO:teuthology.orchestra.run.vm04.stderr: 50.8% -- replaced with /var/log/ceph/ceph-client.admin.58503.log.gz 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85419.log.gz 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71565.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39093.log 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71565.log.gz 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77331.log/var/log/ceph/ceph-client.admin.54747.log: 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54747.log.gz 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33147.log 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39093.log: /var/log/ceph/ceph-client.admin.77331.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78108.log 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77331.log.gz 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- 
replaced with /var/log/ceph/ceph-client.admin.39093.log.gz 2026-03-23T18:34:57.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33147.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33147.log.gz 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40322.log 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78108.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58500.log 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78108.log.gz 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40322.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46688.log 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40322.log.gz 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60002.log 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58500.log.gz 2026-03-23T18:34:57.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30603.log 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46688.log.gz 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60002.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62969.log 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60002.log.gz 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.74246.log/var/log/ceph/ceph-client.admin.30603.log: 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30603.log.gz 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45688.log 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62969.log.gz 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79089.log 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74246.log.gz 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45688.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61896.log 2026-03-23T18:34:57.780 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45688.log.gz 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79089.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38291.log 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79089.log.gz 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59743.log 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61896.log.gz 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88310.log 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38291.log: 
/var/log/ceph/ceph-client.admin.59743.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47721.log 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59743.log.gz 2026-03-23T18:34:57.781 INFO:teuthology.orchestra.run.vm04.stderr: 50.4% -- replaced with /var/log/ceph/ceph-client.admin.38291.log.gz 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88310.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88310.log.gz 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62872.log 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47721.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62283.log 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47721.log.gz 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62872.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33045.log 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62872.log.gz 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62283.log.gz 2026-03-23T18:34:57.782 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48590.log 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91695.log 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33045.log.gz 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61294.log 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48590.log.gz 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91695.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76456.log 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91695.log.gz 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28585.log 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61294.log.gz 2026-03-23T18:34:57.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72436.log 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76456.log.gz 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30989.log 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28585.log.gz 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72436.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48548.log 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72436.log.gz 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30989.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66167.log 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.30989.log.gz 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43870.log 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48548.log.gz 2026-03-23T18:34:57.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54127.log 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66167.log.gz 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43870.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31250.log 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54127.log: 52.8% -- replaced with /var/log/ceph/ceph-client.admin.43870.log.gz 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54127.log.gz 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28853.log 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31250.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80944.log 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31250.log.gz 2026-03-23T18:34:57.785 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28693.log 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28853.log.gz 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65047.log 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80944.log.gz 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28693.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73757.log 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28693.log.gz 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42886.log 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65047.log.gz 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73757.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58793.log 2026-03-23T18:34:57.786 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73757.log.gz 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71307.log 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42886.log.gz 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44609.log 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58793.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73559.log 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71307.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.58793.log.gz 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.71307.log.gz 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44609.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39634.log 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.44609.log.gz 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73559.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47592.log 2026-03-23T18:34:57.787 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73559.log.gz 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68350.log 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39634.log.gz 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47592.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79290.log 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47592.log.gz 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68350.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28112.log 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68350.log.gz 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39114.log 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79290.log.gz 2026-03-23T18:34:57.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.26093.log 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28112.log.gz 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39114.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34216.log 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26093.log: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.39114.log.gz 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26093.log.gz 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32943.log 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34216.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83379.log 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34216.log.gz 2026-03-23T18:34:57.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32943.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28478.log 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32943.log.gz 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50881.log 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83379.log.gz 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28478.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57078.log 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.28478.log.gz 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32473.log 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50881.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50881.log.gz 2026-03-23T18:34:57.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57078.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80769.log 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55262.log 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57078.log.gz 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78086.log 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80769.log: /var/log/ceph/ceph-client.admin.32473.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80769.log.gz 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32473.log.gz 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55262.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68006.log 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55262.log.gz 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75938.log 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78086.log.gz 2026-03-23T18:34:57.791 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68006.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86517.log 2026-03-23T18:34:57.791 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68006.log.gz 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75938.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34657.log 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75938.log.gz 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54105.log 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86517.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86517.log.gz 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34657.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66185.log 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54105.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.34657.log.gz 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54105.log.gz 2026-03-23T18:34:57.792 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27070.log 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66185.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33368.log 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66185.log.gz 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59725.log/var/log/ceph/ceph-client.admin.27070.log: 
2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27070.log.gz 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28542.log 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33368.log.gz 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32790.log 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59725.log.gz 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28542.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79347.log 2026-03-23T18:34:57.793 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28542.log.gz 2026-03-23T18:34:57.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78657.log 2026-03-23T18:34:57.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32790.log.gz 2026-03-23T18:34:57.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51968.log 2026-03-23T18:34:57.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79347.log.gz 2026-03-23T18:34:57.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78657.log.gz 2026-03-23T18:34:57.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69395.log 2026-03-23T18:34:57.794 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64734.log 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51968.log: /var/log/ceph/ceph-client.admin.69395.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78918.log 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51968.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69395.log.gz 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64734.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90218.log 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64734.log.gz 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81632.log 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78918.log.gz 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50966.log 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90218.log.gz 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81632.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37331.log 2026-03-23T18:34:57.795 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81632.log.gz 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50966.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88550.log 
2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50966.log.gz 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50246.log 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37331.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37331.log.gz 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38967.log 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88550.log.gz 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50246.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50246.log.gz -5 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.37382.log 2026-03-23T18:34:57.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38967.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55988.log 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85271.log 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37382.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38967.log.gz 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37382.log.gz 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75916.log 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55988.log: /var/log/ceph/ceph-client.admin.85271.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.55988.log.gz 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85271.log.gz 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78459.log 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38249.log 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75916.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75916.log.gz 2026-03-23T18:34:57.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48326.log 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78459.log.gz 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38249.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65827.log 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38249.log.gz 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48326.log.gz 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28026.log 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65827.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60278.log 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65827.log.gz 2026-03-23T18:34:57.798 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58346.log 2026-03-23T18:34:57.798 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28026.log.gz 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79159.log 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60278.log.gz 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58346.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38652.log 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58346.log.gz 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79159.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32705.log 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79159.log.gz 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43113.log 2026-03-23T18:34:57.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38652.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34048.log 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32705.log.gz 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38652.log.gz 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43113.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38568.log 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43113.log.gz 
2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34048.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69791.log 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34405.log 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.34048.log.gz 2026-03-23T18:34:57.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38568.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63822.log 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69791.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38568.log.gz 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34405.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.69791.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49988.log 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.34405.log.gz 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63822.log.gz 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89487.log 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49988.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38589.log 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49988.log.gz 2026-03-23T18:34:57.801 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60518.log 2026-03-23T18:34:57.801 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89487.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89487.log.gz 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42384.log 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38589.log: /var/log/ceph/ceph-client.admin.60518.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70538.log 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60518.log.gz 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.38589.log.gz 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59963.log 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42384.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49750.log 2026-03-23T18:34:57.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70538.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42384.log.gz 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70538.log.gz 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59963.log.gz 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48716.log 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28349.log 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49750.log.gz 
2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45841.log 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48716.log.gz 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28349.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28349.log.gz 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73409.log 2026-03-23T18:34:57.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41995.log 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45841.log.gz 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73409.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81116.log 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73409.log.gz 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41995.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41995.log.gz 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36209.log 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76002.log 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81116.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81116.log.gz 2026-03-23T18:34:57.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36209.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.40150.log 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36209.log.gz 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76002.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76002.log.gz 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76110.log 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37724.log 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40150.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40150.log.gz 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76110.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29161.log 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76110.log.gz 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37724.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60474.log 2026-03-23T18:34:57.805 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37724.log.gz 2026-03-23T18:34:57.806 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29161.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33555.log 2026-03-23T18:34:57.806 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29161.log.gz 2026-03-23T18:34:57.806 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88370.log 2026-03-23T18:34:57.806 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60474.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60474.log.gz 2026-03-23T18:34:57.806 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33555.log.gz 2026-03-23T18:34:57.806 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79715.log 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88370.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72217.log 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88370.log.gz 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79715.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47041.log 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.79715.log.gz 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72217.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71350.log 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.72217.log.gz 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47041.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53589.log 2026-03-23T18:34:57.807 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47041.log.gz 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71350.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86797.log 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71350.log.gz 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53589.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.53589.log.gz 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89809.log 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73000.log 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86797.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86797.log.gz 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39999.log 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89809.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89809.log.gz 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73000.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73000.log.gz 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86259.log 2026-03-23T18:34:57.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50117.log 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39999.log.gz 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86259.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63197.log 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86259.log.gz 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50117.log.gz 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69583.log 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86883.log 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63197.log.gz 2026-03-23T18:34:57.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69583.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87037.log 2026-03-23T18:34:57.810 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69583.log.gz 2026-03-23T18:34:57.810 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86883.log.gz 2026-03-23T18:34:57.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46741.log 2026-03-23T18:34:57.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.87037.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.54935.log 2026-03-23T18:34:57.810 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87037.log.gz 2026-03-23T18:34:57.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27961.log 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46741.log: /var/log/ceph/ceph-client.admin.54935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54935.log.gz 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46741.log.gz 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59201.log 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82785.log 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27961.log: /var/log/ceph/ceph-client.admin.59201.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44110.log 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59201.log.gz 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.27961.log.gz 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82785.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82785.log.gz 2026-03-23T18:34:57.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64851.log 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44110.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72675.log 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44110.log.gz 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68888.log 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64851.log: /var/log/ceph/ceph-client.admin.72675.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70732.log 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72675.log.gz 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr: 29.0% -- replaced with /var/log/ceph/ceph-client.admin.64851.log.gz 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68028.log 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68888.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.68888.log.gz 2026-03-23T18:34:57.812 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71415.log 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70732.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70732.log.gz 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68028.log.gz 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38270.log 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71415.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45495.log 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71415.log.gz 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38505.log 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38270.log: /var/log/ceph/ceph-client.admin.45495.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27219.log 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45495.log.gz 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.38270.log.gz 2026-03-23T18:34:57.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76607.log 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38505.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49191.log 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr: 
26.5%/var/log/ceph/ceph-client.admin.27219.log: -- replaced with /var/log/ceph/ceph-client.admin.38505.log.gz 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27219.log.gz 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76607.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53783.log 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76607.log.gz 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49191.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79367.log 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.49191.log.gz 2026-03-23T18:34:57.814 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46863.log 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53783.log.gz 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79367.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42724.log 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79367.log.gz 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46863.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79444.log 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46863.log.gz 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49879.log 2026-03-23T18:34:57.815 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42724.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59029.log 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79444.log.gz 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr: 11.1% -- replaced with /var/log/ceph/ceph-client.admin.42724.log.gz 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49879.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57036.log 2026-03-23T18:34:57.815 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49879.log.gz 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59029.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89573.log 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59029.log.gz 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62305.log 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57036.log.gz 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89573.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87875.log 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89573.log.gz 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62305.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29484.log 2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62305.log.gz 
2026-03-23T18:34:57.816 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29655.log 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87875.log.gz 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67048.log 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29484.log.gz 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29655.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80533.log 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29655.log.gz 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.67048.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.28521.log 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67048.log.gz 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79033.log 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80533.log.gz 2026-03-23T18:34:57.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83751.log 2026-03-23T18:34:57.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28521.log.gz 2026-03-23T18:34:57.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79033.log: gzip -5 --verbose -- 
2026-03-23T18:34:57.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79033.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79033.log.gz 2026-03-23T18:34:57.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83751.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83751.log.gz 2026-03-23T18:34:57.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38463.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38463.log.gz 2026-03-23T18:34:57.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32439.log.gz 2026-03-23T18:34:57.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71844.log.gz 2026-03-23T18:34:57.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32320.log.gz 2026-03-23T18:34:57.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34237.log.gz
2026-03-23T18:34:57.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70087.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70087.log.gz 2026-03-23T18:34:57.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31680.log.gz 2026-03-23T18:34:57.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62262.log.gz 2026-03-23T18:34:57.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62369.log.gz 2026-03-23T18:34:57.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81266.log.gz 2026-03-23T18:34:57.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76693.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76693.log.gz
2026-03-23T18:34:57.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34027.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.34027.log.gz 2026-03-23T18:34:57.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79785.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79785.log.gz 2026-03-23T18:34:57.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53525.log.gz 2026-03-23T18:34:57.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46422.log.gz 2026-03-23T18:34:57.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67834.log.gz 2026-03-23T18:34:57.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61144.log.gz
2026-03-23T18:34:57.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30536.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30536.log.gz 2026-03-23T18:34:57.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57585.log.gz 2026-03-23T18:34:57.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80987.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80987.log.gz 2026-03-23T18:34:57.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88820.log.gz 2026-03-23T18:34:57.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58079.log.gz 2026-03-23T18:34:57.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76972.log.gz 2026-03-23T18:34:57.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89852.log.gz
2026-03-23T18:34:57.824 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47549.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47549.log.gz 2026-03-23T18:34:57.824 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43651.log.gz 2026-03-23T18:34:57.824 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79979.log.gz 2026-03-23T18:34:57.824 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71372.log.gz 2026-03-23T18:34:57.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44471.log.gz 2026-03-23T18:34:57.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77037.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77037.log.gz
2026-03-23T18:34:57.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72355.log.gz 2026-03-23T18:34:57.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90024.log.gz 2026-03-23T18:34:57.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82419.log.gz 2026-03-23T18:34:57.826 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42304.log.gz 2026-03-23T18:34:57.826 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26397.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26397.log.gz 2026-03-23T18:34:57.826 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36124.log.gz
2026-03-23T18:34:57.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80877.log.gz 2026-03-23T18:34:57.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70431.log.gz 2026-03-23T18:34:57.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75109.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75109.log.gz 2026-03-23T18:34:57.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62197.log.gz 2026-03-23T18:34:57.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77721.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77721.log.gz 2026-03-23T18:34:57.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71651.log.gz 2026-03-23T18:34:57.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78408.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78408.log.gz
2026-03-23T18:34:57.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32269.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32269.log.gz 2026-03-23T18:34:57.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77678.log.gz 2026-03-23T18:34:57.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35266.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35266.log.gz 2026-03-23T18:34:57.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89207.log: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.89207.log.gz 2026-03-23T18:34:57.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70985.log.gz 2026-03-23T18:34:57.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35434.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.35434.log.gz 2026-03-23T18:34:57.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36396.log.gz
2026-03-23T18:34:57.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72914.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72914.log.gz 2026-03-23T18:34:57.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51261.log.gz 2026-03-23T18:34:57.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82978.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82978.log.gz 2026-03-23T18:34:57.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90089.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90089.log.gz 2026-03-23T18:34:57.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58101.log.gz 2026-03-23T18:34:57.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54726.log.gz
2026-03-23T18:34:57.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90067.log.gz 2026-03-23T18:34:57.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44409.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44409.log.gz 2026-03-23T18:34:57.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79213.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79213.log.gz 2026-03-23T18:34:57.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41120.log.gz 2026-03-23T18:34:57.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28134.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28134.log.gz 2026-03-23T18:34:57.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34720.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.34720.log.gz 2026-03-23T18:34:57.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57308.log.gz
2026-03-23T18:34:57.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62025.log.gz 2026-03-23T18:34:57.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46400.log.gz 2026-03-23T18:34:57.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80511.log.gz 2026-03-23T18:34:57.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81717.log.gz 2026-03-23T18:34:57.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61552.log.gz 2026-03-23T18:34:57.834 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67212.log.gz
2026-03-23T18:34:57.834 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45559.log.gz 2026-03-23T18:34:57.834 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75208.log.gz 2026-03-23T18:34:57.834 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43133.log: 58.4% -- replaced with /var/log/ceph/ceph-client.admin.43133.log.gz 2026-03-23T18:34:57.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31099.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31099.log.gz 2026-03-23T18:34:57.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66224.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66224.log.gz 2026-03-23T18:34:57.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36192.log.gz 2026-03-23T18:34:57.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51583.log.gz
2026-03-23T18:34:57.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39198.log: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.39198.log.gz 2026-03-23T18:34:57.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65221.log.gz 2026-03-23T18:34:57.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72094.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.72094.log.gz 2026-03-23T18:34:57.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71071.log.gz 2026-03-23T18:34:57.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84290.log.gz 2026-03-23T18:34:57.837 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37314.log.gz
2026-03-23T18:34:57.837 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90046.log.gz 2026-03-23T18:34:57.837 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75619.log.gz 2026-03-23T18:34:57.837 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32824.log.gz 2026-03-23T18:34:57.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45927.log.gz 2026-03-23T18:34:57.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31872.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31872.log.gz 2026-03-23T18:34:57.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57649.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57649.log.gz 2026-03-23T18:34:57.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37144.log.gz
2026-03-23T18:34:57.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66747.log.gz 2026-03-23T18:34:57.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27816.log.gz 2026-03-23T18:34:57.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85698.log.gz 2026-03-23T18:34:57.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43932.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43932.log.gz 2026-03-23T18:34:57.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73366.log.gz 2026-03-23T18:34:57.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32892.log.gz
2026-03-23T18:34:57.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27919.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.27919.log.gz 2026-03-23T18:34:57.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33775.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.33775.log.gz 2026-03-23T18:34:57.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40579.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40579.log.gz 2026-03-23T18:34:57.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88350.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88350.log.gz 2026-03-23T18:34:57.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66284.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.66284.log.gz 2026-03-23T18:34:57.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65084.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65084.log.gz 2026-03-23T18:34:57.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57563.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57563.log.gz
2026-03-23T18:34:57.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36158.log.gz 2026-03-23T18:34:57.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80055.log.gz 2026-03-23T18:34:57.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83214.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83214.log.gz 2026-03-23T18:34:57.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49707.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49707.log.gz 2026-03-23T18:34:57.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70603.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70603.log.gz 2026-03-23T18:34:57.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41808.log.gz 2026-03-23T18:34:57.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83042.log.gz
2026-03-23T18:34:57.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59981.log.gz 2026-03-23T18:34:57.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50504.log.gz 2026-03-23T18:34:57.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35140.log: 25.0% -- replaced with /var/log/ceph/ceph-client.admin.35140.log.gz 2026-03-23T18:34:57.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45065.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45065.log.gz 2026-03-23T18:34:57.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88252.log.gz 2026-03-23T18:34:57.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69038.log.gz
2026-03-23T18:34:57.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63320.log.gz 2026-03-23T18:34:57.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87660.log.gz 2026-03-23T18:34:57.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79071.log.gz 2026-03-23T18:34:57.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74139.log.gz 2026-03-23T18:34:57.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44288.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44288.log.gz 2026-03-23T18:34:57.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89250.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89250.log.gz
2026-03-23T18:34:57.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75723.log.gz 2026-03-23T18:34:57.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80318.log.gz 2026-03-23T18:34:57.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66009.log.gz 2026-03-23T18:34:57.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72297.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.72297.log.gz 2026-03-23T18:34:57.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77495.log.gz 2026-03-23T18:34:57.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41378.log.gz
2026-03-23T18:34:57.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81159.log.gz 2026-03-23T18:34:57.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38400.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38400.log.gz 2026-03-23T18:34:57.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47003.log.gz 2026-03-23T18:34:57.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25812.log.gz 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46228.log.gz 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44151.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.44151.log.gz 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41292.log.gz
2026-03-23T18:34:57.847 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34930.log 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86367.log 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42764.log 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- 
replaced with /var/log/ceph/ceph-client.admin.34930.log.gz 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86367.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76650.log 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86367.log.gz 2026-03-23T18:34:57.848 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50922.log 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42764.log: /var/log/ceph/ceph-client.admin.76650.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69728.log 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76650.log.gz 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr: 59.4% -- replaced with /var/log/ceph/ceph-client.admin.42764.log.gz 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32609.log 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50922.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50922.log.gz 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69728.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40279.log 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69728.log.gz 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32609.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35245.log 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32609.log.gz 2026-03-23T18:34:57.849 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.44692.log 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40279.log: 0.0%/var/log/ceph/ceph-client.admin.35245.log: -- replaced with /var/log/ceph/ceph-client.admin.40279.log.gz 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69889.log 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35245.log.gz 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59862.log 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44692.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.44692.log.gz 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27020.log 2026-03-23T18:34:57.850 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59862.log: /var/log/ceph/ceph-client.admin.69889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69889.log.gz 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59862.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74684.log 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27020.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91385.log 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27020.log.gz 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58269.log 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74684.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.74684.log.gz 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49836.log 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91385.log.gz 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69929.log 2026-03-23T18:34:57.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58269.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58269.log.gz 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49836.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86453.log 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49836.log.gz 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74340.log 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69929.log.gz 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81417.log 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86453.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86453.log.gz 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74340.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50675.log 0.0% 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.74340.log.gz 2026-03-23T18:34:57.852 
INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.81417.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.32773.log 2026-03-23T18:34:57.852 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81417.log.gz 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78872.log 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50675.log.gz 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32773.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54521.log 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32773.log.gz 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78872.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34531.log 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78872.log.gz 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54521.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76088.log 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54521.log.gz 2026-03-23T18:34:57.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34531.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86171.log 2026-03-23T18:34:57.854 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76088.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73606.log 2026-03-23T18:34:57.854 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.34531.log.gz 
2026-03-23T18:34:57.854 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76088.log.gz 2026-03-23T18:34:57.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81887.log 2026-03-23T18:34:57.854 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86171.log.gz 2026-03-23T18:34:57.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33589.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73606.log: /var/log/ceph/ceph-client.admin.81887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73606.log.gz 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81973.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81887.log.gz 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33589.log.gz 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80490.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57351.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81973.log.gz 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54261.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80490.log.gz 2026-03-23T18:34:57.855 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57351.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39072.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57351.log.gz 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54261.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32149.log 2026-03-23T18:34:57.855 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54261.log.gz 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89895.log 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39072.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32184.log 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32149.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.39072.log.gz 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32149.log.gz 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89895.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73774.log 0.0% 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.89895.log.gz 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32184.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54234.log 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32184.log.gz 2026-03-23T18:34:57.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33470.log 2026-03-23T18:34:57.857 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73774.log.gz/var/log/ceph/ceph-client.admin.54234.log: 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56183.log 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54234.log.gz 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.33470.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.76757.log 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33470.log.gz 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27532.log 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56183.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56183.log.gz 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27940.log 2026-03-23T18:34:57.857 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76757.log.gz 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27532.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32235.log 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27532.log.gz 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33266.log 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27940.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44188.log 
2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.27940.log.gz 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32235.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32235.log.gz 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33266.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58875.log 2026-03-23T18:34:57.858 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33266.log.gz 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44188.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57499.log 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51562.log 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.44188.log.gz 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58875.log.gz 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69317.log 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57499.log.gz 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mgr.x.log 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51562.log.gz 2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69317.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22814.log 
2026-03-23T18:34:57.859 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69317.log.gz 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-mgr.x.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66554.log 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.22814.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70581.log 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22814.log.gz 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78365.log 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70581.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39892.log 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70581.log.gz 2026-03-23T18:34:57.860 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80382.log 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78365.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78365.log.gz 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47270.log 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39892.log.gz 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80382.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68264.log 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80382.log.gz 2026-03-23T18:34:57.861 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47657.log 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47270.log.gz 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83665.log 2026-03-23T18:34:57.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68264.log.gz 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47657.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51734.log 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47657.log.gz 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70130.log 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83665.log.gz 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73108.log 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51734.log.gz 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70130.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25710.log 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70130.log.gz 2026-03-23T18:34:57.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38883.log 2026-03-23T18:34:57.863 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73108.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73108.log.gz 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65028.log 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25710.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25710.log.gz 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38883.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31551.log 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65028.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38883.log.gz 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65028.log.gz 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28370.log 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31551.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71200.log 2026-03-23T18:34:57.863 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31551.log.gz 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35767.log 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28370.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28370.log.gz 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62545.log 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71200.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71200.log.gz 
2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35767.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87423.log 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35767.log.gz 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46530.log 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62545.log.gz 2026-03-23T18:34:57.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65633.log 2026-03-23T18:34:57.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87423.log.gz 2026-03-23T18:34:57.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34741.log 2026-03-23T18:34:57.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46530.log.gz 2026-03-23T18:34:57.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65633.log: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.65633.log.gz 2026-03-23T18:34:57.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34174.log 2026-03-23T18:34:57.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35098.log 2026-03-23T18:34:57.866 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34741.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33649.log 2026-03-23T18:34:57.866 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34174.log: /var/log/ceph/ceph-client.admin.35098.log: 26.0% -- replaced 
with /var/log/ceph/ceph-client.admin.34741.log.gz 2026-03-23T18:34:57.866 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.34174.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29548.log 2026-03-23T18:34:57.866 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.866 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35098.log.gz 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33649.log: 24.9% -- replaced with /var/log/ceph/ceph-client.admin.33649.log.gz 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81202.log 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66554.log.gzgzip 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.50203.log 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29548.log.gz 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81202.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81202.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65458.log 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.867 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64337.log 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50203.log.gz 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65458.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67664.log 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65458.log.gz 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87918.log 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64337.log.gz 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64950.log 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67664.log.gz 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87918.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48632.log 2026-03-23T18:34:57.868 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87918.log.gz 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39978.log 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64950.log.gz 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31041.log 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48632.log.gz 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39978.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59901.log 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.39978.log.gz 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68114.log 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31041.log.gz 2026-03-23T18:34:57.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57520.log 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59901.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59901.log.gz 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68114.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67791.log 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68114.log.gz 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71029.log 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57520.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57520.log.gz 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90175.log 2026-03-23T18:34:57.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67791.log: /var/log/ceph/ceph-client.admin.71029.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67272.log 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.71029.log.gz 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.67791.log.gz 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90175.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.90175.log.gz 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33300.log 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65767.log 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67272.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67272.log.gz 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33300.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74021.log 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33300.log.gz 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54019.log 2026-03-23T18:34:57.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65767.log.gz 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88132.log 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74021.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74021.log.gz 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54019.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81008.log 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54019.log.gz 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68802.log 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88132.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.88132.log.gz 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87054.log 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81008.log.gz 2026-03-23T18:34:57.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68802.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60788.log 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68802.log.gz 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84874.log 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87054.log.gz 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79836.log 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60788.log.gz 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84874.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62889.log 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84874.log.gz 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79836.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69829.log 2026-03-23T18:34:57.873 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79836.log.gz 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.87292.log 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62889.log: /var/log/ceph/ceph-client.admin.69829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62889.log.gz 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.86042.log -- replaced with /var/log/ceph/ceph-client.admin.69829.log.gz 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87292.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43952.log 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87292.log.gz 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88622.log 2026-03-23T18:34:57.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86042.log.gz 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43952.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52053.log 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.43952.log.gz 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88622.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88622.log.gz 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77374.log 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.52053.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83250.log 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr: 57.9% -- 
replaced with /var/log/ceph/ceph-client.admin.52053.log.gz 2026-03-23T18:34:57.875 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42683.log 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77374.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77374.log.gz 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83250.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53482.log 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr: 86.2% -- replaced with /var/log/ceph/ceph-client.admin.83250.log.gz/var/log/ceph/ceph-client.admin.42683.log: 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60886.log 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr: 52.8% -- replaced with /var/log/ceph/ceph-client.admin.42683.log.gz 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40719.log 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53482.log: /var/log/ceph/ceph-client.admin.60886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53482.log.gz 2026-03-23T18:34:57.876 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60886.log.gz 2026-03-23T18:34:57.877 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83251.log 2026-03-23T18:34:57.877 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40719.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80017.log 2026-03-23T18:34:57.877 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40719.log.gz 2026-03-23T18:34:57.877 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83251.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65867.log 2026-03-23T18:34:57.877 INFO:teuthology.orchestra.run.vm04.stderr: 86.3% -- replaced with /var/log/ceph/ceph-client.admin.83251.log.gz 2026-03-23T18:34:57.877 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62757.log 2026-03-23T18:34:57.877 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80017.log.gz 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65867.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62176.log 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65867.log.gz 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56892.log 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62757.log.gz 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60540.log 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62176.log.gz 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84960.log 2026-03-23T18:34:57.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56892.log: /var/log/ceph/ceph-client.admin.60540.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61767.log 2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60540.log.gz 
2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56892.log.gz 2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84960.log.gz 2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28714.log 2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48925.log 2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61767.log.gz 2026-03-23T18:34:57.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35161.log 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28714.log.gz 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.48925.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.51992.log 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77285.log 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.48925.log.gz 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35161.log: /var/log/ceph/ceph-client.admin.51992.log: 0.0% 25.6% -- replaced with /var/log/ceph/ceph-client.admin.35161.log.gz 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.51992.log.gz 2026-03-23T18:34:57.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69124.log 2026-03-23T18:34:57.880 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40171.log 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77285.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77285.log.gz 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69124.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82333.log 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69124.log.gz 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40171.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42622.log 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40171.log.gz 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42584.log 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82333.log.gz 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42622.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33249.log 2026-03-23T18:34:57.881 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42622.log.gz 2026-03-23T18:34:57.882 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42584.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41098.log 2026-03-23T18:34:57.882 INFO:teuthology.orchestra.run.vm04.stderr: 56.2% -- replaced with /var/log/ceph/ceph-client.admin.42584.log.gz 2026-03-23T18:34:57.882 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.33249.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.49945.log 
2026-03-23T18:34:57.882 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33249.log.gz 2026-03-23T18:34:57.882 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79870.log 2026-03-23T18:34:57.882 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41098.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80554.log 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49945.log.gz 26.0% -- replaced with /var/log/ceph/ceph-client.admin.41098.log.gz 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79870.log.gz 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32166.log 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73965.log 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80554.log.gz 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72318.log 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32166.log.gz 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54915.log 2026-03-23T18:34:57.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73965.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73965.log.gz 2026-03-23T18:34:57.884 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72318.log.gz 2026-03-23T18:34:57.884 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32643.log 2026-03-23T18:34:57.884 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31228.log 2026-03-23T18:34:57.884 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54915.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54915.log.gz 2026-03-23T18:34:57.884 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76260.log 2026-03-23T18:34:57.884 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32643.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32643.log.gz 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31228.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84594.log 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31228.log.gz 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39318.log 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76260.log.gz 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58058.log 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84594.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84594.log.gz 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.39318.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.75148.log 
2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63488.log 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39318.log.gz 2026-03-23T18:34:57.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58058.log.gz 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75148.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75148.log.gz 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65948.log 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28757.log 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63488.log.gz 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65948.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62988.log 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65948.log.gz 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.28757.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.65281.log 2026-03-23T18:34:57.886 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28757.log.gz 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27257.log 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62988.log: /var/log/ceph/ceph-client.admin.65281.log: 58.1% -- replaced with 
/var/log/ceph/ceph-client.admin.62988.log.gz 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46901.log 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65281.log.gz 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27257.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29719.log 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27257.log.gz 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55059.log 2026-03-23T18:34:57.887 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46901.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46901.log.gz 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29719.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58243.log 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29719.log.gz 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28413.log 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55059.log.gz 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37178.log 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58243.log.gz 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28413.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.49535.log 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28413.log.gz 2026-03-23T18:34:57.888 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41700.log 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37178.log.gz 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81866.log 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49535.log.gz 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41700.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58772.log 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41700.log.gz 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39507.log 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81866.log.gz 2026-03-23T18:34:57.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60768.log 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58772.log: /var/log/ceph/ceph-client.admin.39507.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.58772.log.gz 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75055.log 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60768.log: 25.7% -- 
replaced with /var/log/ceph/ceph-client.admin.39507.log.gz 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60768.log.gz 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29977.log 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75055.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82244.log 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75055.log.gz 2026-03-23T18:34:57.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56930.log 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29977.log.gz 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67460.log 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82244.log.gz 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56930.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41078.log 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56930.log.gz 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79698.log 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67460.log.gz 2026-03-23T18:34:57.891 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56247.log 
2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41078.log.gz 2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.79698.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.78742.log 2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79698.log.gz 2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56247.log.gz 2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71009.log 2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87617.log 2026-03-23T18:34:57.892 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78742.log.gz 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76131.log 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71009.log.gz 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54062.log 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87617.log.gz 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65104.log 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76131.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.76131.log.gz 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91411.log 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54062.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54062.log.gz 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33011.log 2026-03-23T18:34:57.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65104.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65104.log.gz 2026-03-23T18:34:57.894 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91411.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91411.log.gz 2026-03-23T18:34:57.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85934.log 2026-03-23T18:34:57.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65807.log 2026-03-23T18:34:57.894 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33011.log.gz 2026-03-23T18:34:57.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32626.log 2026-03-23T18:34:57.894 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85934.log.gz 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65807.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49277.log 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65807.log.gz 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.41399.log 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32626.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32626.log.gz 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47850.log 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49277.log: /var/log/ceph/ceph-client.admin.41399.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37640.log 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41399.log.gz 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.49277.log.gz 2026-03-23T18:34:57.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47850.log.gz 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27398.log 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87114.log 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37640.log: /var/log/ceph/ceph-client.admin.27398.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48117.log 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27398.log.gz 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37640.log.gz 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87114.log.gz 2026-03-23T18:34:57.896 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.83858.log
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87982.log
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48117.log.gz
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80662.log
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83858.log.gz
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87982.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41614.log
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87982.log.gz
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80662.log.gz
2026-03-23T18:34:57.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46921.log
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41571.log
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41614.log.gz
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28894.log
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46921.log.gz
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41571.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83249.log
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41571.log.gz
2026-03-23T18:34:57.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87254.log
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83249.log: /var/log/ceph/ceph-client.admin.28894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28894.log.gz
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr: 89.0% -- replaced with /var/log/ceph/ceph-client.admin.83249.log.gz
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86711.log
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76564.log
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87254.log.gz
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46142.log
2026-03-23T18:34:57.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86711.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86711.log.gz
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76564.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89293.log
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76564.log.gz
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85114.log
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46142.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46142.log.gz
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85003.log
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89293.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.89293.log.gz
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86990.log
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85114.log.gz
2026-03-23T18:34:57.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85003.log.gz
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33113.log
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74283.log
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86990.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86990.log.gz
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34489.log
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33113.log.gz
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62412.log
2026-03-23T18:34:57.901 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74283.log.gz
2026-03-23T18:34:57.902 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34468.log
2026-03-23T18:34:57.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34489.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45043.log
2026-03-23T18:34:57.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62412.log: 0.0% 25.6% -- replaced with /var/log/ceph/ceph-client.admin.34489.log.gz
2026-03-23T18:34:57.902 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.62412.log.gz
2026-03-23T18:34:57.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34468.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81567.log
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.34468.log.gz
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45043.log.gz
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86128.log
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81567.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42222.log
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81567.log.gz
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28435.log
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86128.log.gz
2026-03-23T18:34:57.903 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28671.log
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42222.log: /var/log/ceph/ceph-client.admin.28435.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73495.log
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% 54.6% -- replaced with /var/log/ceph/ceph-client.admin.42222.log.gz
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.28435.log.gz
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28671.log.gz
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70007.log
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57119.log
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73495.log.gz
2026-03-23T18:34:57.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71114.log
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70007.log: 52.7% -- replaced with /var/log/ceph/ceph-client.admin.70007.log.gz/var/log/ceph/ceph-client.admin.57119.log:
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56634.log
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57119.log.gz
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71114.log.gz
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56484.log
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38186.log
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56634.log.gz
2026-03-23T18:34:57.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85344.log
2026-03-23T18:34:57.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56484.log.gz
2026-03-23T18:34:57.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38186.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40936.log
2026-03-23T18:34:57.906 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38186.log.gz
2026-03-23T18:34:57.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85344.log.gz
2026-03-23T18:34:57.906 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80965.log
2026-03-23T18:34:57.906 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36566.log
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80965.log: /var/log/ceph/ceph-client.admin.40936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80965.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40936.log.gz
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61939.log
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36566.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73861.log
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36566.log.gz
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66403.log
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61939.log.gz
2026-03-23T18:34:57.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72714.log
2026-03-23T18:34:57.908 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73861.log.gz
2026-03-23T18:34:57.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61574.log
2026-03-23T18:34:57.908 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66403.log.gz
2026-03-23T18:34:57.908 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72714.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.72714.log.gz
2026-03-23T18:34:57.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89444.log
2026-03-23T18:34:57.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.tmp-client.admin.19118.log
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61574.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61574.log.gz
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43381.log
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89444.log.gz
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41872.log
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph.tmp-client.admin.19118.log: 0.0% -- replaced with /var/log/ceph/ceph.tmp-client.admin.19118.log.gz
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56913.log
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43381.log.gz
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44372.log
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41872.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41872.log.gz
2026-03-23T18:34:57.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31962.log
2026-03-23T18:34:57.910 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56913.log.gz
2026-03-23T18:34:57.910 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44372.log.gz
2026-03-23T18:34:57.910 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66618.log
2026-03-23T18:34:57.910 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63029.log
2026-03-23T18:34:57.910 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31962.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31962.log.gz
2026-03-23T18:34:57.910 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56505.log
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66618.log.gz
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88663.log
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63029.log.gz
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56505.log.gz
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83455.log
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78323.log
2026-03-23T18:34:57.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88663.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88663.log.gz
2026-03-23T18:34:57.912 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83455.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39177.log
2026-03-23T18:34:57.912 INFO:teuthology.orchestra.run.vm04.stderr: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.83455.log.gz
2026-03-23T18:34:57.912 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61273.log
2026-03-23T18:34:57.912 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78323.log.gz
2026-03-23T18:34:57.912 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39177.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.39177.log.gz
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87595.log
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60315.log
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61273.log.gz
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87595.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27124.log
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87595.log.gz
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33901.log
2026-03-23T18:34:57.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60315.log.gz
2026-03-23T18:34:57.914 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27124.log.gz
2026-03-23T18:34:57.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53654.log
2026-03-23T18:34:57.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49170.log
2026-03-23T18:34:57.914 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33901.log: /var/log/ceph/ceph-client.admin.53654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53654.log.gz
2026-03-23T18:34:57.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60602.log
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.33901.log.gz
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85088.log
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49170.log.gz
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70280.log
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60602.log.gz
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85088.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81073.log
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85088.log.gz
2026-03-23T18:34:57.915 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88687.log
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70280.log.gz
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71737.log
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81073.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81073.log.gz
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88687.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84073.log
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88687.log.gz
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31894.log
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71737.log.gz
2026-03-23T18:34:57.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41974.log
2026-03-23T18:34:57.917 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84073.log: 58.5% -- replaced with /var/log/ceph/ceph-client.admin.84073.log.gz
2026-03-23T18:34:57.917 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31894.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70924.log
2026-03-23T18:34:57.917 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31894.log.gz
2026-03-23T18:34:57.917 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.41974.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.26849.log
2026-03-23T18:34:57.917 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41974.log.gz
2026-03-23T18:34:57.917 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32677.log
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70924.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.70924.log.gz
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26849.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68673.log
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26849.log.gz
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42801.log
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32677.log.gz
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63668.log
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68673.log.gz
2026-03-23T18:34:57.918 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61445.log
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42801.log: /var/log/ceph/ceph-client.admin.63668.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42801.log.gz
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63668.log.gz
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82312.log
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58986.log
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61445.log.gz
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82312.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25705.log
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82312.log.gz
2026-03-23T18:34:57.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62906.log
2026-03-23T18:34:57.920 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58986.log.gz
2026-03-23T18:34:57.920 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25705.log.gz
2026-03-23T18:34:57.920 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87961.log
2026-03-23T18:34:57.920 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60413.log
2026-03-23T18:34:57.920 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62906.log.gz
2026-03-23T18:34:57.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87961.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58814.log
2026-03-23T18:34:57.921 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87961.log.gz
2026-03-23T18:34:57.921 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33062.log
2026-03-23T18:34:57.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60413.log.gz
2026-03-23T18:34:57.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58814.log.gz
2026-03-23T18:34:57.921 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42405.log
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82677.log
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33062.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33062.log.gz
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42405.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58423.log
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42405.log.gz
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74103.log
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82677.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84551.log
2026-03-23T18:34:57.922 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82677.log.gz
2026-03-23T18:34:57.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74103.log: /var/log/ceph/ceph-client.admin.58423.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58423.log.gz
2026-03-23T18:34:57.923 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.74103.log.gz
2026-03-23T18:34:57.923 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51050.log
2026-03-23T18:34:57.923 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58608.log
2026-03-23T18:34:57.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84551.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84551.log.gz
2026-03-23T18:34:57.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51050.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84680.log
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51050.log.gz
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79751.log
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58608.log.gz
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88232.log
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84680.log.gz
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79751.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44650.log
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79751.log.gz
2026-03-23T18:34:57.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88290.log
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88232.log.gz
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65690.log
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44650.log.gz
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30106.log
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88290.log.gz
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43610.log
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65690.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65690.log.gz
2026-03-23T18:34:57.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49385.log
2026-03-23T18:34:57.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30106.log.gz
2026-03-23T18:34:57.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43610.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57972.log
2026-03-23T18:34:57.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49385.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.43610.log.gz
2026-03-23T18:34:57.926 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49385.log.gz
2026-03-23T18:34:57.926 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41953.log
2026-03-23T18:34:57.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57972.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45777.log
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57972.log.gz
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29204.log
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41953.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41953.log.gz
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58461.log
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45777.log.gz
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29204.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79482.log
2026-03-23T18:34:57.927 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29204.log.gz
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31486.log
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58461.log.gz
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47997.log
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79482.log.gz
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31486.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87380.log
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31486.log.gz
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79938.log
2026-03-23T18:34:57.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47997.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47997.log.gz
2026-03-23T18:34:57.929 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87380.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87380.log.gz
2026-03-23T18:34:57.929 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56828.log
2026-03-23T18:34:57.929 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54625.log
2026-03-23T18:34:57.929 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79938.log: /var/log/ceph/ceph-client.admin.56828.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33453.log
2026-03-23T18:34:57.929 INFO:teuthology.orchestra.run.vm04.stderr: 51.7% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79938.log.gz
2026-03-23T18:34:57.929 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.56828.log.gz
2026-03-23T18:34:57.930 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54625.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74753.log
2026-03-23T18:34:57.930 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54625.log.gz
2026-03-23T18:34:57.930 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57016.log
2026-03-23T18:34:57.930 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33453.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33453.log.gz
2026-03-23T18:34:57.930 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74753.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50353.log
2026-03-23T18:34:57.931 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74753.log.gz
2026-03-23T18:34:57.931 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26548.log
2026-03-23T18:34:57.931 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57016.log.gz
2026-03-23T18:34:57.931 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50353.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50353.log.gz
2026-03-23T18:34:57.931 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69812.log
2026-03-23T18:34:57.931 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83086.log
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26548.log.gz
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69812.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57391.log
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69812.log.gz
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65422.log
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83086.log.gz
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74595.log
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57391.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.57391.log.gz
2026-03-23T18:34:57.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51325.log
2026-03-23T18:34:57.933 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65422.log.gz
2026-03-23T18:34:57.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63802.log
2026-03-23T18:34:57.933 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74595.log.gz
2026-03-23T18:34:57.933 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51325.log.gz
2026-03-23T18:34:57.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56140.log
2026-03-23T18:34:57.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58223.log
2026-03-23T18:34:57.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63802.log.gz
2026-03-23T18:34:57.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56140.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35920.log
2026-03-23T18:34:57.934 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56140.log.gz 2026-03-23T18:34:57.934 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69686.log 2026-03-23T18:34:57.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58223.log.gz 2026-03-23T18:34:57.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35920.log.gz 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30755.log 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60864.log 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69686.log: /var/log/ceph/ceph-client.admin.30755.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85526.log 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.69686.log.gz 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30755.log.gz 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60864.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39135.log 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60864.log.gz 2026-03-23T18:34:57.935 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86213.log 2026-03-23T18:34:57.936 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85526.log.gz 2026-03-23T18:34:57.936 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81438.log 2026-03-23T18:34:57.936 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39135.log: /var/log/ceph/ceph-client.admin.86213.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28069.log 2026-03-23T18:34:57.936 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86213.log.gz 2026-03-23T18:34:57.936 INFO:teuthology.orchestra.run.vm04.stderr: 25.0% -- replaced with /var/log/ceph/ceph-client.admin.39135.log.gz 2026-03-23T18:34:57.936 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81438.log.gz 2026-03-23T18:34:57.936 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38778.log 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63434.log 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28069.log.gz 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47506.log 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38778.log: /var/log/ceph/ceph-client.admin.63434.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88410.log 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63434.log.gz 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.38778.log.gz 2026-03-23T18:34:57.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47506.log.gz 
2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31357.log 2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61875.log 2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88410.log.gz 2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53890.log 2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31357.log.gz 2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61875.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34279.log 2026-03-23T18:34:57.938 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61875.log.gz 2026-03-23T18:34:57.939 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40365.log 2026-03-23T18:34:57.939 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53890.log.gz 2026-03-23T18:34:57.939 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58015.log 2026-03-23T18:34:57.939 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34279.log: /var/log/ceph/ceph-client.admin.40365.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75396.log 2026-03-23T18:34:57.939 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40365.log.gz 2026-03-23T18:34:57.939 INFO:teuthology.orchestra.run.vm04.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.34279.log.gz 2026-03-23T18:34:57.939 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58015.log.gz 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71264.log 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66919.log 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75396.log.gz 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39570.log 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71264.log.gz 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5/var/log/ceph/ceph-client.admin.66919.log: --verbose -- /var/log/ceph/ceph-client.admin.69275.log 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66919.log.gz 2026-03-23T18:34:57.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26763.log 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39570.log: /var/log/ceph/ceph-client.admin.69275.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64969.log 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69275.log.gz 26.1% -- replaced with /var/log/ceph/ceph-client.admin.39570.log.gz 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26763.log.gz 
2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30515.log 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55323.log 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64969.log.gz 2026-03-23T18:34:57.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64005.log 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30515.log.gz 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55323.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27284.log 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55323.log.gz 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54301.log 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64005.log.gz 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46827.log 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27284.log.gz 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54301.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51070.log 2026-03-23T18:34:57.942 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54301.log.gz 2026-03-23T18:34:57.943 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46827.log.gz 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54829.log 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25602.log 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51070.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51070.log.gz 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33096.log 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54829.log.gz 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25602.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53611.log 2026-03-23T18:34:57.943 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25602.log.gz 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44571.log 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33096.log.gz 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75532.log 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53611.log.gz 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44571.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81845.log 
2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35818.log 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.44571.log.gz 2026-03-23T18:34:57.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75532.log.gz 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81845.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81845.log.gz 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68415.log 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46099.log 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35818.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35818.log.gz 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68415.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65614.log 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68415.log.gz 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46099.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45022.log 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46099.log.gz 2026-03-23T18:34:57.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55816.log 2026-03-23T18:34:57.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65614.log.gz 2026-03-23T18:34:57.946 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45022.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47463.log 2026-03-23T18:34:57.946 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45022.log.gz 2026-03-23T18:34:57.946 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69849.log 2026-03-23T18:34:57.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55816.log.gz 2026-03-23T18:34:57.946 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31979.log 2026-03-23T18:34:57.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47463.log.gz 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69849.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87184.log 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69849.log.gz 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51927.log 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31979.log.gz 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61466.log 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87184.log.gz 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51927.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51927.log.gz -5 
2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.84831.log 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61466.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40976.log 2026-03-23T18:34:57.947 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61466.log.gz 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44851.log 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84831.log.gz 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40976.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.40976.log.gz -5 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.50224.log 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88111.log 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44851.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44851.log.gz 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59588.log 2026-03-23T18:34:57.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50224.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50224.log.gz 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88111.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46758.log 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88111.log.gz 2026-03-23T18:34:57.949 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68522.log 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59588.log.gz 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73194.log 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46758.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46758.log.gz 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68522.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85870.log 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68522.log.gz 2026-03-23T18:34:57.949 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72173.log 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73194.log.gz 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46013.log 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85870.log.gz 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72173.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63725.log 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72173.log.gz 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82484.log 2026-03-23T18:34:57.950 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46013.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46013.log.gz 2026-03-23T18:34:57.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34783.log 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63725.log.gz 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82484.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62486.log 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82484.log.gz 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30850.log 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34783.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.34783.log.gz 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87509.log 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62486.log.gz 2026-03-23T18:34:57.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30850.log.gz 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73129.log 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30149.log 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87509.log.gz 
2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81245.log 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73129.log.gz 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30149.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80597.log 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30149.log.gz 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59783.log 2026-03-23T18:34:57.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81245.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81245.log.gz 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66203.log 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80597.log.gz 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59783.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25454.log 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59783.log.gz 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41786.log 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66203.log.gz 2026-03-23T18:34:57.953 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26483.log 2026-03-23T18:34:57.953 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25454.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25454.log.gz 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41786.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85461.log 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41786.log.gz 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35560.log 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26483.log.gz 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54995.log 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85461.log.gz 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35560.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51884.log 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86689.log 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35560.log.gz 2026-03-23T18:34:57.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54995.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54995.log.gz 2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51884.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51884.log.gz 2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35350.log 
2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57328.log
2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86689.log.gz
2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35350.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60239.log
2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35350.log.gz
2026-03-23T18:34:57.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57328.log.gz
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46293.log
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60239.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74964.log
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr: 10.7% -- replaced with /var/log/ceph/ceph-client.admin.60239.log.gz
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50902.log
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46293.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46293.log.gz
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47917.log
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74964.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74964.log.gz
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26655.log
2026-03-23T18:34:57.956 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50902.log.gz
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76174.log
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47917.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47917.log.gz
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46250.log
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26655.log.gz
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76174.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41893.log
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76174.log.gz
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86020.log
2026-03-23T18:34:57.957 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46250.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46250.log.gz
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75251.log
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41893.log.gz
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86020.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29891.log
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86020.log.gz
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40736.log
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75251.log.gz
2026-03-23T18:34:57.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74774.log
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40736.log: /var/log/ceph/ceph-client.admin.29891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40736.log.gz
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29891.log.gz
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32909.log
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74774.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91307.log
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74774.log.gz
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.32909.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.37598.log
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22862.log
2026-03-23T18:34:57.959 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32909.log.gz
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91307.log.gz
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37598.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59007.log
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.37598.log.gz
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44251.log
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.22862.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22862.log.gz
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59007.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86238.log
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59007.log.gz
2026-03-23T18:34:57.960 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69969.log
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44251.log.gz
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58403.log
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86238.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86238.log.gz
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69969.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.86582.log -- replaced with /var/log/ceph/ceph-client.admin.69969.log.gz
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67070.log
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58403.log.gz
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79404.log
2026-03-23T18:34:57.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86582.log.gz
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67070.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31808.log
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67070.log.gz
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29419.log
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79404.log.gz
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30321.log
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31808.log.gz
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29419.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67877.log
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29419.log.gz
2026-03-23T18:34:57.962 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78849.log
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30321.log.gz
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27626.log
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67877.log: /var/log/ceph/ceph-client.admin.78849.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.67877.log.gz
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63180.log
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78849.log.gz
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27626.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57606.log
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27626.log.gz
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36668.log
2026-03-23T18:34:57.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63180.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63180.log.gz
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57606.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71780.log
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57606.log.gz
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31996.log
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36668.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36668.log.gz
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79680.log
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71780.log.gz
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31996.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47399.log
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31996.log.gz
2026-03-23T18:34:57.964 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85148.log
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79680.log.gz
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63611.log
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47399.log.gz
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85148.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31314.log
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85148.log.gz
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73301.log
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63611.log.gz
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82935.log
2026-03-23T18:34:57.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31314.log.gz
2026-03-23T18:34:57.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73301.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29676.log
2026-03-23T18:34:57.966 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73301.log.gz
2026-03-23T18:34:57.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62469.log
2026-03-23T18:34:57.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82935.log.gz
2026-03-23T18:34:57.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56972.log
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29676.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29676.log.gz
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40343.log
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62469.log.gz
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79269.log
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56972.log.gz
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76521.log
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40343.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40343.log.gz
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79269.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.79269.log.gz -5
2026-03-23T18:34:57.967 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.58854.log
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35623.log
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76521.log.gz
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33521.log
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58854.log: /var/log/ceph/ceph-client.admin.35623.log: 26.7%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57821.log
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35623.log.gz
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.58854.log.gz
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33521.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63922.log
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33521.log.gz
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89336.log
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57821.log.gz
2026-03-23T18:34:57.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63922.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43812.log
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63922.log.gz
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60907.log
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89336.log.gz
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26677.log
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43812.log: /var/log/ceph/ceph-client.admin.60907.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48527.log
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60907.log.gz
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.43812.log.gz
2026-03-23T18:34:57.969 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26677.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30343.log
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26677.log.gz
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44131.log
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48527.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48527.log.gz
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30343.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44550.log
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30343.log.gz
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38799.log
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44131.log.gz
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87853.log
2026-03-23T18:34:57.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44550.log.gz
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38799.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50310.log
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87853.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38799.log.gz
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87853.log.gz
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74302.log
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50310.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55386.log
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50310.log.gz
2026-03-23T18:34:57.971 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43174.log
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74302.log: 30.3% -- replaced with /var/log/ceph/ceph-client.admin.74302.log.gz
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72893.log
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55386.log: 26.4%/var/log/ceph/ceph-client.admin.43174.log: -- replaced with /var/log/ceph/ceph-client.admin.55386.log.gz
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31121.log
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43174.log.gz
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72893.log.gz
2026-03-23T18:34:57.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50820.log
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37934.log
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31121.log.gz
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36345.log
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50820.log.gz
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37934.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84159.log
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37934.log.gz
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36345.log.gz
2026-03-23T18:34:57.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71673.log
2026-03-23T18:34:57.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84159.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60257.log
2026-03-23T18:34:57.974 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84159.log.gz
2026-03-23T18:34:57.974 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31400.log
2026-03-23T18:34:57.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71673.log.gz
2026-03-23T18:34:57.974 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61165.log
2026-03-23T18:34:57.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60257.log.gz
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31400.log.gz
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33164.log
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61165.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81743.log
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61165.log.gz
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43072.log
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33164.log.gz
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81743.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78674.log
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.81743.log.gz
2026-03-23T18:34:57.975 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31185.log
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43072.log.gz
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33436.log
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78674.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78674.log.gz
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31185.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83000.log
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31185.log.gz
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34804.log
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33436.log.gz
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59883.log
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83000.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83000.log.gz
2026-03-23T18:34:57.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34804.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74871.log
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.34804.log.gz
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59883.log.gz
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90603.log
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74946.log
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74871.log.gz
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91593.log
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90603.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90603.log.gz
2026-03-23T18:34:57.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.74946.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.49621.log
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74946.log.gz
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.91593.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.85244.log
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91593.log.gz
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48946.log
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49621.log.gz
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85762.log
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85244.log.gz
2026-03-23T18:34:57.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48946.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74577.log
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34384.log
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.48946.log.gz
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85762.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85762.log.gz
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74577.log.gz
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91285.log
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75959.log
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34384.log: /var/log/ceph/ceph-client.admin.91285.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30386.log
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91285.log.gz
2026-03-23T18:34:57.979 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.34384.log.gz
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75959.log.gz
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64451.log
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73689.log
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30386.log.gz
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64451.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67440.log
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64451.log.gz
2026-03-23T18:34:57.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73689.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86324.log
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr: 29.4% -- replaced with /var/log/ceph/ceph-client.admin.73689.log.gz
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57413.log
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67440.log.gz
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86324.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55580.log
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86324.log.gz
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64715.log
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57413.log.gz
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51626.log
2026-03-23T18:34:57.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55580.log.gz
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64715.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.64715.log.gz /var/log/ceph/ceph-client.admin.49320.log
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr:
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51626.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63689.log
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51626.log.gz
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73172.log
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49320.log: /var/log/ceph/ceph-client.admin.63689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63689.log.gz
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43792.log
2026-03-23T18:34:57.982 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.49320.log.gz
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73172.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73344.log
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73172.log.gz
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.43792.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.48485.log
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43792.log.gz
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80111.log
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73344.log.gz
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33334.log
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48485.log.gz
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80111.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40879.log
2026-03-23T18:34:57.983 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80111.log.gz
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86085.log
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33334.log.gz
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46465.log
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40879.log: /var/log/ceph/ceph-client.admin.86085.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60950.log
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86085.log.gz
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr: 66.7% -- replaced with /var/log/ceph/ceph-client.admin.40879.log.gz
2026-03-23T18:34:57.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46465.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46465.log.gz
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82165.log
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83579.log
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60950.log.gz
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82165.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39423.log
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82165.log.gz
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83579.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47248.log
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83579.log.gz
2026-03-23T18:34:57.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73580.log
2026-03-23T18:34:57.986 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39423.log: /var/log/ceph/ceph-client.admin.47248.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66265.log
2026-03-23T18:34:57.986 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47248.log.gz
2026-03-23T18:34:57.986 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.39423.log.gz
2026-03-23T18:34:57.986 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73580.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.73580.log.gz -5 --verbose -- /var/log/ceph/ceph-client.admin.70945.log
2026-03-23T18:34:57.986 INFO:teuthology.orchestra.run.vm04.stderr:
2026-03-23T18:34:57.986 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66265.log.gz
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39156.log
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70945.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46564.log
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70945.log.gz
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36702.log
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39156.log: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.39156.log.gz
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77304.log
2026-03-23T18:34:57.987 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46564.log.gz
2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36702.log.gz
2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37808.log
2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83021.log
2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77304.log.gz
2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38988.log
2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37808.log: /var/log/ceph/ceph-client.admin.83021.log: gzip -5 --verbose
-- /var/log/ceph/ceph-client.admin.69253.log 2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83021.log.gz 2026-03-23T18:34:57.988 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37808.log.gz 2026-03-23T18:34:57.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38988.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38988.log.gz 2026-03-23T18:34:57.989 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81777.log 2026-03-23T18:34:57.989 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34615.log 2026-03-23T18:34:57.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69253.log.gz 2026-03-23T18:34:57.989 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35903.log 2026-03-23T18:34:57.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81777.log.gz 2026-03-23T18:34:57.990 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34615.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72276.log 2026-03-23T18:34:57.990 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34615.log.gz 2026-03-23T18:34:57.990 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35903.log.gz 2026-03-23T18:34:57.990 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52033.log 2026-03-23T18:34:57.990 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33817.log 2026-03-23T18:34:57.991 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72276.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27342.log 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.52033.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.72276.log.gz 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52033.log.gz 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33817.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40956.log 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.33817.log.gz 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33964.log 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27342.log.gz 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54084.log 2026-03-23T18:34:57.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40956.log.gz 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63765.log/var/log/ceph/ceph-client.admin.33964.log: 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57058.log 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.33964.log.gz 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54084.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54084.log.gz 
2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63765.log.gz 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45301.log 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42439.log 2026-03-23T18:34:57.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57058.log: /var/log/ceph/ceph-client.admin.45301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57058.log.gz 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40407.log 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45301.log.gz 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42439.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33985.log 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31465.log 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.42439.log.gz 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40407.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40407.log.gz 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33985.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47313.log 2026-03-23T18:34:57.993 INFO:teuthology.orchestra.run.vm04.stderr: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.33985.log.gz 2026-03-23T18:34:57.994 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54484.log 2026-03-23T18:34:57.994 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31465.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31465.log.gz 2026-03-23T18:34:57.994 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38337.log 2026-03-23T18:34:57.994 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47313.log.gz 2026-03-23T18:34:57.994 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54484.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45863.log 2026-03-23T18:34:57.994 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54484.log.gz 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38337.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38337.log.gz 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57434.log 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90110.log 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45863.log.gz 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50611.log 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57434.log.gz 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90110.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69645.log 2026-03-23T18:34:57.995 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90110.log.gz 
2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46443.log 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50611.log.gz 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74083.log 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69645.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69645.log.gz 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46443.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50074.log 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46443.log.gz 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70108.log 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74083.log.gz 2026-03-23T18:34:57.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50692.log 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50074.log.gz 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70108.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86496.log 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70108.log.gz 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58648.log 2026-03-23T18:34:57.997 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50692.log.gz 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67582.log 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86496.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86496.log.gz 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58648.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35886.log 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58648.log.gz 2026-03-23T18:34:57.997 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85182.log 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67582.log.gz 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68135.log 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35886.log.gz 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85182.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54361.log 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr: 89.8% -- replaced with /var/log/ceph/ceph-client.admin.85182.log.gz 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40679.log 2026-03-23T18:34:57.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68135.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68135.log.gz 
2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54361.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37195.log 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54361.log.gz 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.40679.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.81760.log 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68372.log 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr: 66.0% -- replaced with /var/log/ceph/ceph-client.admin.40679.log.gz 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37195.log.gz 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81760.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81760.log.gz 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78725.log 2026-03-23T18:34:57.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71544.log 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68372.log.gz 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78725.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42202.log 0.0% 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.78725.log.gz 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71221.log 2026-03-23T18:34:58.000 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71544.log.gz 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60042.log 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42202.log: /var/log/ceph/ceph-client.admin.71221.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74041.log 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71221.log.gz 2026-03-23T18:34:58.000 INFO:teuthology.orchestra.run.vm04.stderr: 52.8% -- replaced with /var/log/ceph/ceph-client.admin.42202.log.gz 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60042.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66511.log 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.60042.log.gz 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55344.log 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74041.log.gz 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66511.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36277.log 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66511.log.gz 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72871.log 2026-03-23T18:34:58.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55344.log.gz 
2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29633.log 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36277.log.gz 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35392.log 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72871.log.gz 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83163.log 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29633.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77459.log 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.29633.log.gz 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35392.log: /var/log/ceph/ceph-client.admin.83163.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.35392.log.gz 2026-03-23T18:34:58.002 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83163.log.gz 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43402.log 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80898.log 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77459.log.gz 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82720.log 2026-03-23T18:34:58.003 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43402.log.gz 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80898.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41058.log 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80898.log.gz 2026-03-23T18:34:58.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40815.log 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82720.log.gz 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64161.log 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41058.log.gz 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40815.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33670.log 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40815.log.gz 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64161.log.gz 2026-03-23T18:34:58.004 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80812.log 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72055.log 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33670.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33670.log.gz 
2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37212.log 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80812.log.gz 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72055.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71092.log 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr: 57.1% -- replaced with /var/log/ceph/ceph-client.admin.72055.log.gz 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37212.log.gz 2026-03-23T18:34:58.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72693.log 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71092.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65403.log 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71092.log.gz 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59941.log 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72693.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47227.log 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.72693.log.gz 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65403.log.gz 2026-03-23T18:34:58.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59941.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.59941.log.gz 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59825.log 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30063.log 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47227.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47227.log.gz 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58144.log 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59825.log.gz 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30063.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41593.log 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30063.log.gz 2026-03-23T18:34:58.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86646.log 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58144.log.gz 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77442.log 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41593.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41593.log.gz 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86646.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38207.log 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.86646.log.gz 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77442.log.gz 2026-03-23T18:34:58.008 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34363.log 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35644.log 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38207.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38207.log.gz 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43568.log 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34363.log: /var/log/ceph/ceph-client.admin.35644.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56419.log 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.34363.log.gz 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.35644.log.gz 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43568.log.gz 2026-03-23T18:34:58.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31594.log 2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56419.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45280.log 2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56419.log.gz 2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31594.log: gzip -5 
2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82078.log
2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31594.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31594.log.gz
2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70409.log
2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45280.log.gz
2026-03-23T18:34:58.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82078.log.gz
[... 2026-03-23T18:34:58.010 through 18:34:58.045: interleaved stderr from ~130 concurrent `gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.<pid>.log` jobs on vm04; each job reported "<file>: N% -- replaced with <file>.gz". Most files compressed at 0.0%; a handful compressed in the 25-59% range (e.g. 34972.log: 26.1%, 44914.log: 25.6%, 64180.log: 53.8%, 53440.log: 56.0%, 63279.log: 58.2%, 88643.log: 59.3%). ...]
/var/log/ceph/ceph-client.admin.54645.log 2026-03-23T18:34:58.045 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64122.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71759.log 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64122.log.gz 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28306.log.gz 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54645.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54645.log.gz 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42986.log 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89723.log 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71759.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71759.log.gz 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85024.log 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42986.log.gz 2026-03-23T18:34:58.046 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63339.log 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89723.log.gz 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69296.log 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85024.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.85024.log.gz 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55773.log 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63339.log.gz 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.69296.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.67687.log 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69296.log.gz 2026-03-23T18:34:58.047 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55773.log.gz 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31765.log 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41829.log 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67687.log.gz 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43296.log 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31765.log.gz 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41829.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42481.log 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41829.log.gz 2026-03-23T18:34:58.048 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.51820.log 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43296.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63963.log 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42481.log: 52.8% -- replaced with /var/log/ceph/ceph-client.admin.43296.log.gz 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51820.log.gz 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr: 26.1%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42119.log 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.42481.log.gz 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69522.log 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63963.log.gz 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42119.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.42119.log.gz -5 2026-03-23T18:34:58.049 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.64755.log 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.69522.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.88004.log 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.69522.log.gz 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80128.log 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64755.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.64755.log.gz 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88004.log.gz 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51113.log 2026-03-23T18:34:58.050 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37469.log 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80128.log.gz 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54341.log 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51113.log.gz 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37469.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78344.log 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73947.log 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.37469.log.gz 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54341.log.gz 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78344.log.gz 2026-03-23T18:34:58.051 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36022.log 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.48967.log 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73947.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73947.log.gz 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36022.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55924.log 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36022.log.gz 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.48967.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.25726.log 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48967.log.gz 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32739.log 2026-03-23T18:34:58.052 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55924.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55924.log.gz 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25726.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53420.log 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25726.log.gz 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50461.log 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32739.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32739.log.gz 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70173.log 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53420.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.53420.log.gz 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50461.log.gzgzip -5 --verbose -- 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.40428.log 2026-03-23T18:34:58.053 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43276.log 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70173.log.gz 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74440.log 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40428.log: /var/log/ceph/ceph-client.admin.43276.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40021.log 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43276.log.gz 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.40428.log.gz 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74440.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65928.log 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74440.log.gz 2026-03-23T18:34:58.054 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84030.log 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40021.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40021.log.gz 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65928.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.65928.log.gz 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27550.log 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60160.log 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84030.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84030.log.gz 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28155.log 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27550.log.gz 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60160.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.60160.log.gz -- 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.78542.log 2026-03-23T18:34:58.055 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42181.log 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28155.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28155.log.gz 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87810.log 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78542.log.gz 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42181.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82849.log 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.42181.log.gz 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46314.log 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87810.log.gz 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84116.log 2026-03-23T18:34:58.056 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82849.log.gz 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46314.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47764.log 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46314.log.gz 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41227.log 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84116.log: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.84116.log.gz 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76671.log 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47764.log.gz 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41227.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.30679.log 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.41227.log.gz 2026-03-23T18:34:58.057 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.73664.log 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76671.log.gz 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51454.log 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30679.log.gz 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73664.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23015.log 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73664.log.gz 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79251.log 2026-03-23T18:34:58.058 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51454.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48506.log 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51454.log.gz 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.23015.log: /var/log/ceph/ceph-client.admin.79251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79251.log.gz 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr: 26.9% -- replaced with /var/log/ceph/ceph-client.admin.23015.log.gz 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76951.log 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56226.log 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48506.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.48506.log.gz 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62638.log 2026-03-23T18:34:58.059 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76951.log.gz 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28456.log 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56226.log.gz 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62638.log.gz 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72375.log 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59437.log 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28456.log.gz 2026-03-23T18:34:58.060 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81949.log 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72375.log.gz 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59437.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26376.log 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59437.log.gz 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81949.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.81949.log.gz 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48696.log 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56548.log 2026-03-23T18:34:58.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26376.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26376.log.gz 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30472.log 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48696.log.gz 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56548.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74736.log 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56548.log.gz 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30472.log.gz 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61251.log 2026-03-23T18:34:58.062 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77979.log 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74736.log.gz 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61251.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54401.log 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.61251.log.gz 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46547.log 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77979.log.gz 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84315.log 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54401.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54401.log.gz 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46547.log.gzgzip 2026-03-23T18:34:58.063 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.67480.log 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80296.log 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84315.log.gz 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49106.log 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67480.log.gz 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80296.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25634.log 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80296.log.gz 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.39528.log 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49106.log.gz 2026-03-23T18:34:58.064 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45602.log 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25634.log.gz 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39528.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48234.log 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.39528.log.gz 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45602.log.gz 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67337.log 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48234.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36328.log 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48234.log.gz 2026-03-23T18:34:58.065 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29226.log 2026-03-23T18:34:58.066 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67337.log.gz 2026-03-23T18:34:58.066 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27456.log 2026-03-23T18:34:58.066 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36328.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.36328.log.gz
2026-03-23T18:34:58.066 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85440.log
2026-03-23T18:34:58.066 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29226.log.gz
2026-03-23T18:34:58.066 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27456.log.gz
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76217.log
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30776.log
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85440.log.gz
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55039.log
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76217.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76217.log.gz
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30776.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33838.log
2026-03-23T18:34:58.067 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30776.log.gz
2026-03-23T18:34:58.068 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55039.log.gz
2026-03-23T18:34:58.068 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60199.log
2026-03-23T18:34:58.068 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62699.log
2026-03-23T18:34:58.068 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33838.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54705.log
2026-03-23T18:34:58.068 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60199.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60199.log.gz
2026-03-23T18:34:58.068 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.33838.log.gz
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62699.log.gz
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90153.log
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79645.log
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54705.log.gz
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55141.log
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90153.log.gz
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79645.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78129.log
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79645.log.gz
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88025.log
2026-03-23T18:34:58.069 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55141.log.gz
2026-03-23T18:34:58.070 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78795.log
2026-03-23T18:34:58.070 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78129.log.gz
2026-03-23T18:34:58.070 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.88025.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.43631.log
2026-03-23T18:34:58.070 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88025.log.gz
2026-03-23T18:34:58.070 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78795.log.gz
2026-03-23T18:34:58.070 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51411.log
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77144.log
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43631.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43631.log.gz
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51411.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49363.log
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51411.log.gz
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80189.log
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77144.log.gz
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57245.log
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49363.log: /var/log/ceph/ceph-client.admin.80189.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.49363.log.gz
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80189.log.gz
2026-03-23T18:34:58.071 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43464.log
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57245.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75701.log
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57245.log.gz
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83103.log
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43464.log.gz
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75701.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58524.log
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75701.log.gz
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30715.log
2026-03-23T18:34:58.072 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83103.log.gz
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34447.log
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58524.log: /var/log/ceph/ceph-client.admin.30715.log: 43.9% -- replaced with /var/log/ceph/ceph-client.admin.58524.log.gz
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr:gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30715.log.gz -5
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.34993.log
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34447.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54148.log
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38904.log
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr: 25.6%/var/log/ceph/ceph-client.admin.34993.log: -- replaced with /var/log/ceph/ceph-client.admin.34447.log.gz
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54148.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54148.log.gz
2026-03-23T18:34:58.073 INFO:teuthology.orchestra.run.vm04.stderr: 25.0% -- replaced with /var/log/ceph/ceph-client.admin.34993.log.gz
2026-03-23T18:34:58.074 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76628.log
2026-03-23T18:34:58.074 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75680.log
2026-03-23T18:34:58.074 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38904.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90003.log
2026-03-23T18:34:58.074 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76628.log.gz
2026-03-23T18:34:58.074 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38904.log.gz
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68200.log
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75680.log.gz
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62927.log
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90003.log.gz
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31006.log
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68200.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68200.log.gz
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71931.log
2026-03-23T18:34:58.075 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62927.log.gz
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31006.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77234.log
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31006.log.gz
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40541.log
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71931.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33351.log
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77234.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71931.log.gz
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.77234.log.gz
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40541.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58669.log
2026-03-23T18:34:58.076 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.40541.log.gz
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74810.log
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33351.log.gz
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43849.log
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58669.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58669.log.gz
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74810.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30560.log
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74810.log.gz
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43849.log.gz
2026-03-23T18:34:58.077 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86754.log
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72335.log
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30560.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30560.log.gz
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45387.log
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86754.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86754.log.gz
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72335.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77699.log
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72335.log.gz
2026-03-23T18:34:58.078 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45387.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45387.log.gz
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58383.log
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75188.log
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77699.log.gz
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39030.log
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58383.log.gz
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75188.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59804.log
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75188.log.gz
2026-03-23T18:34:58.079 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36855.log
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39030.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60662.log
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59804.log: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.39030.log.gz
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59804.log.gz
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36855.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36855.log.gz
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37297.log
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79921.log
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60662.log.gz
2026-03-23T18:34:58.080 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42057.log
2026-03-23T18:34:58.081 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37297.log.gz
2026-03-23T18:34:58.081 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79921.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48213.log
2026-03-23T18:34:58.081 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79921.log.gz
2026-03-23T18:34:58.081 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42057.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.42057.log.gz
2026-03-23T18:34:58.081 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54850.log
2026-03-23T18:34:58.081 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85165.log
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48213.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65593.log
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48213.log.gz
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54850.log.gz
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85165.log.gz
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89530.log
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65593.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65593.log.gz
2026-03-23T18:34:58.082 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78619.log
2026-03-23T18:34:58.083 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89530.log.gz
2026-03-23T18:34:58.083 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67232.log
2026-03-23T18:34:58.083 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78619.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41679.log
2026-03-23T18:34:58.083 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78619.log.gz
2026-03-23T18:34:58.083 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65710.log
2026-03-23T18:34:58.083 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67232.log: /var/log/ceph/ceph-client.admin.41679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67232.log.gz
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41679.log.gz
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43829.log
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65710.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66575.log
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65710.log.gz
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38421.log
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43829.log.gz
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66575.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.66575.log.gz
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.55205.log
2026-03-23T18:34:58.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38421.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63530.log
2026-03-23T18:34:58.085 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70067.log
2026-03-23T18:34:58.085 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.38421.log.gz
2026-03-23T18:34:58.085 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55205.log.gz
2026-03-23T18:34:58.085 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63530.log.gz
2026-03-23T18:34:58.085 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55644.log
2026-03-23T18:34:58.085 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63010.log
2026-03-23T18:34:58.086 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70067.log.gz
2026-03-23T18:34:58.086 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55644.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47877.log
2026-03-23T18:34:58.086 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.55644.log.gz
2026-03-23T18:34:58.086 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63010.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45344.log
2026-03-23T18:34:58.086 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63010.log.gz
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47877.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46983.log
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47877.log.gz
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54321.log/var/log/ceph/ceph-client.admin.45344.log:
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45344.log.gz
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32354.log
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46983.log.gz
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80210.log
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54321.log.gz
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32354.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35581.log
2026-03-23T18:34:58.087 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32354.log.gz
2026-03-23T18:34:58.088 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80210.log.gz
2026-03-23T18:34:58.088 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69146.log
2026-03-23T18:34:58.088 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25576.log
2026-03-23T18:34:58.088 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35581.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86668.log
2026-03-23T18:34:58.088 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69146.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.35581.log.gz
2026-03-23T18:34:58.088 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69146.log.gz
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25576.log.gz
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79557.log
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51691.log
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86668.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86668.log.gz
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47098.log
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79557.log.gz
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51691.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26069.log
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51691.log.gz
2026-03-23T18:34:58.089 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57757.log
2026-03-23T18:34:58.090 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47098.log.gz
2026-03-23T18:34:58.090 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86388.log
2026-03-23T18:34:58.090 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26069.log.gz
2026-03-23T18:34:58.090 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57757.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68092.log
2026-03-23T18:34:58.090 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57757.log.gz
2026-03-23T18:34:58.090 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86388.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86388.log.gz
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70560.log
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76303.log
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68092.log.gz
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76435.log
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70560.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70560.log.gz
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76303.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27142.log
2026-03-23T18:34:58.091 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76303.log.gz
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76435.log.gz
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77058.log
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.log
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27142.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27142.log.gz
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51841.log
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77058.log.gz
2026-03-23T18:34:58.092 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35329.log
2026-03-23T18:34:58.093 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51841.log.gz
2026-03-23T18:34:58.093 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57371.log
2026-03-23T18:34:58.093 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57371.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57371.log.gz
2026-03-23T18:34:58.094 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30407.log
2026-03-23T18:34:58.094 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76994.log
2026-03-23T18:34:58.094 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35329.log: /var/log/ceph/ceph-client.admin.30407.log: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.35329.log.gz
2026-03-23T18:34:58.094 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30407.log.gz
2026-03-23T18:34:58.094 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40501.log
2026-03-23T18:34:58.094 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41206.log
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76994.log.gz
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40501.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71991.log
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr: 89.7% -- replaced with /var/log/ceph/ceph.log.gz
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.40501.log.gz
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41206.log.gz
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90577.log
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71991.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38123.log
2026-03-23T18:34:58.095 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71991.log.gz
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74648.log
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90577.log.gz
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42364.log
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38123.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46207.log
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74648.log: 0.0% 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38123.log.gz
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.74648.log.gz
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42364.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80425.log
2026-03-23T18:34:58.096 INFO:teuthology.orchestra.run.vm04.stderr: 17.2% -- replaced with /var/log/ceph/ceph-client.admin.42364.log.gz
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76714.log
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46207.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46207.log.gz
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64591.log
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80425.log.gz
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76714.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56871.log
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76714.log.gz
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64591.log.gz
2026-03-23T18:34:58.097 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54501.log
2026-03-23T18:34:58.098 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46845.log
2026-03-23T18:34:58.098 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45258.log
2026-03-23T18:34:58.098 INFO:teuthology.orchestra.run.vm04.stderr: 29.4% -- replaced with /var/log/ceph/ceph-client.admin.56871.log.gz
2026-03-23T18:34:58.098 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54501.log.gz
2026-03-23T18:34:58.098 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46845.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46845.log.gz
2026-03-23T18:34:58.098 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73740.log
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45258.log.gz
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69624.log
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73740.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83129.log
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73740.log.gz
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65538.log
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69624.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69624.log.gz
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.0.log
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83129.log.gz
2026-03-23T18:34:58.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65538.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65124.log
2026-03-23T18:34:58.100 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65538.log.gz
2026-03-23T18:34:58.100 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74927.log
2026-03-23T18:34:58.100 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76346.log
2026-03-23T18:34:58.100 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59523.log
2026-03-23T18:34:58.100 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74927.log.gz
2026-03-23T18:34:58.101 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76346.log.gz
2026-03-23T18:34:58.103 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65124.log.gz
2026-03-23T18:34:58.103 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32286.log
2026-03-23T18:34:58.103 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59523.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38484.log
2026-03-23T18:34:58.103 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59523.log.gz
2026-03-23T18:34:58.103 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32286.log.gz
2026-03-23T18:34:58.115 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80683.log
2026-03-23T18:34:58.115 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38484.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71970.log
2026-03-23T18:34:58.115 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80683.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38484.log.gz
2026-03-23T18:34:58.115 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80683.log.gz
2026-03-23T18:34:58.122 INFO:teuthology.orchestra.run.vm04.stderr: 92.3% -- replaced with /var/log/ceph/ceph-mgr.x.log.gz
2026-03-23T18:34:58.122 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62852.log
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71970.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56785.log
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr: 91.2% -- replaced with /var/log/ceph/ceph-client.admin.71970.log.gz
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74828.log
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62852.log.gz
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56785.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38862.log
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56785.log.gz
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74828.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78525.log
2026-03-23T18:34:58.123 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74828.log.gz
2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64414.log
2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38862.log: /var/log/ceph/ceph-client.admin.78525.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45820.log
2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78525.log.gz
2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38862.log.gz
2026-03-23T18:34:58.124
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64414.log.gz 2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67622.log 2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45820.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61337.log 2026-03-23T18:34:58.124 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45820.log.gz 2026-03-23T18:34:58.125 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.67622.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.68458.log 2026-03-23T18:34:58.125 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67622.log.gz 2026-03-23T18:34:58.125 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61337.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82871.log 2026-03-23T18:34:58.125 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61337.log.gz 2026-03-23T18:34:58.125 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68458.log.gz 2026-03-23T18:34:58.126 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85046.log 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91541.log 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82871.log.gz 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85046.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41356.log 
2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85046.log.gz 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26784.log 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91541.log.gz 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41356.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27898.log 2026-03-23T18:34:58.127 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41356.log.gz 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26784.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74667.log 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26784.log.gz 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55163.log 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27898.log: /var/log/ceph/ceph-client.admin.74667.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80834.log 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74667.log.gz 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.27898.log.gz 2026-03-23T18:34:58.128 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55163.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.55163.log.gz 2026-03-23T18:34:58.130 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59308.log 2026-03-23T18:34:58.131 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80834.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.80834.log.gz -- 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.35371.log 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59308.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36039.log 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59308.log.gz 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35371.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73022.log 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35371.log.gz 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36039.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62068.log 2026-03-23T18:34:58.131 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36039.log.gz 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73022.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25479.log 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73022.log.gz 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62068.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66833.log 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62068.log.gz 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25479.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36311.log 
2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25479.log.gz 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66833.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44492.log 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66833.log.gz 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36311.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39784.log 2026-03-23T18:34:58.132 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36311.log.gz 2026-03-23T18:34:58.133 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44492.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75128.log 2026-03-23T18:34:58.133 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39784.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.44492.log.gz 2026-03-23T18:34:58.133 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62564.log 2026-03-23T18:34:58.133 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39784.log.gz 2026-03-23T18:34:58.133 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75128.log.gz 2026-03-23T18:34:58.134 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82028.log 2026-03-23T18:34:58.135 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62564.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91515.log 2026-03-23T18:34:58.135 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62564.log.gz 2026-03-23T18:34:58.135 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82028.log.gz 2026-03-23T18:34:58.138 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81794.log 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91515.log.gz 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61810.log 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87097.log 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81794.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81794.log.gz 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86926.log 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61810.log.gz 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88212.log 2026-03-23T18:34:58.139 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87097.log.gz 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86926.log.gz 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83248.log 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88212.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30214.log 
2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88212.log.gz 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83248.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59373.log 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30214.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40778.log 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% 87.1% -- replaced with /var/log/ceph/ceph-client.admin.30214.log.gz 2026-03-23T18:34:58.140 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.83248.log.gz 2026-03-23T18:34:58.141 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59373.log.gz 2026-03-23T18:34:58.142 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66489.log 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40778.log.gz 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82505.log 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80361.log 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66489.log.gz 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27088.log 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82505.log.gz 2026-03-23T18:34:58.143 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80361.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41313.log 2026-03-23T18:34:58.143 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80361.log.gz 2026-03-23T18:34:58.144 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27088.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27088.log.gz 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55558.log 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41313.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35539.log 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41313.log.gz 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55558.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50943.log 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr: 27.5% -- replaced with /var/log/ceph/ceph-client.admin.55558.log.gz/var/log/ceph/ceph-client.admin.35539.log: 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74360.log 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.35539.log.gz 2026-03-23T18:34:58.147 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50943.log.gz 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91351.log 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74360.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78280.log 
2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74360.log.gz 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78194.log 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91351.log.gz 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33402.log 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78280.log.gz 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78194.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56312.log 2026-03-23T18:34:58.148 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78194.log.gz 2026-03-23T18:34:58.149 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33402.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74518.log 2026-03-23T18:34:58.149 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33402.log.gz 2026-03-23T18:34:58.149 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56312.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27877.log 2026-03-23T18:34:58.149 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56312.log.gz 2026-03-23T18:34:58.149 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74518.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74518.log.gz 2026-03-23T18:34:58.149 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41722.log 2026-03-23T18:34:58.149 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27877.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30734.log 2026-03-23T18:34:58.150 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41722.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31615.log 2026-03-23T18:34:58.150 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41722.log.gz 2026-03-23T18:34:58.150 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.27877.log.gz 2026-03-23T18:34:58.150 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30734.log.gz 2026-03-23T18:34:58.150 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64833.log 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31615.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45949.log 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31615.log.gz 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64833.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87445.log 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64833.log.gz 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26017.log 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45949.log.gz 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87445.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43546.log 
2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87445.log.gz 2026-03-23T18:34:58.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26017.log.gz 2026-03-23T18:34:58.154 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57886.log 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43546.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32558.log 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43546.log.gz 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64869.log 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57886.log.gz 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32558.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49024.log 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32558.log.gz 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5/var/log/ceph/ceph-client.admin.64869.log: --verbose -- /var/log/ceph/ceph-client.admin.28607.log 2026-03-23T18:34:58.155 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64869.log.gz 2026-03-23T18:34:58.156 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31142.log 2026-03-23T18:34:58.156 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49024.log.gz 2026-03-23T18:34:58.156 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28607.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76542.log 2026-03-23T18:34:58.156 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28607.log.gz 2026-03-23T18:34:58.156 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31142.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31142.log.gz 2026-03-23T18:34:58.158 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60682.log 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76542.log.gz 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77392.log 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72054.log 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60682.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60682.log.gz 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39720.log 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77392.log.gz 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --/var/log/ceph/ceph-client.admin.72054.log: /var/log/ceph/ceph-client.admin.70366.log 2026-03-23T18:34:58.159 INFO:teuthology.orchestra.run.vm04.stderr: 58.8% -- replaced with /var/log/ceph/ceph-client.admin.72054.log.gz 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39720.log.gz 
2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88730.log 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75343.log 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70366.log.gz 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88730.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28241.log 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88730.log.gz 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75343.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34006.log 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75343.log.gz 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28241.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64531.log 2026-03-23T18:34:58.160 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28241.log.gz 2026-03-23T18:34:58.161 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34006.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44309.log 2026-03-23T18:34:58.161 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64531.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83923.log 2026-03-23T18:34:58.161 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64531.log.gz 2026-03-23T18:34:58.161 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.34006.log.gz 2026-03-23T18:34:58.161 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44309.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.44309.log.gz 2026-03-23T18:34:58.162 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78442.log 2026-03-23T18:34:58.163 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83923.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78425.log 2026-03-23T18:34:58.163 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83923.log.gz 2026-03-23T18:34:58.163 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78442.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31293.log 2026-03-23T18:34:58.163 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78442.log.gz 2026-03-23T18:34:58.163 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78425.log.gz 2026-03-23T18:34:58.166 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89186.log 2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36583.log 2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31293.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31293.log.gz 2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89186.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69748.log 2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89186.log.gz 2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36583.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78691.log 
2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36583.log.gz
2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69748.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64026.log
2026-03-23T18:34:58.167 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69748.log.gz
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78691.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66855.log
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78691.log.gz
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28177.log
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64026.log.gz
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66855.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62004.log
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66855.log.gz
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28177.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33712.log
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28177.log.gz
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62004.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62004.log.gz -5
2026-03-23T18:34:58.168 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.46486.log
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33712.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80339.log
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33712.log.gz
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46486.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77208.log
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46486.log.gz
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80339.log.gz
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71630.log
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82203.log
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77208.log.gz
2026-03-23T18:34:58.169 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44792.log
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71630.log.gz
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82203.log.gz
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62680.log
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25992.log
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44792.log: 25.6%/var/log/ceph/ceph-client.admin.62680.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40193.log
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.44792.log.gz
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62680.log.gz
2026-03-23T18:34:58.170 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25992.log.gz
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36906.log
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40193.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35014.log
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40193.log.gz
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29398.log
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36906.log.gz
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35014.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68759.log
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.35014.log.gz
2026-03-23T18:34:58.171 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29398.log.gz
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63066.log
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68759.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31164.log
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68759.log.gz
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82892.log
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63066.log.gz
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31164.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56204.log
2026-03-23T18:34:58.172 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31164.log.gz
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82892.log.gz
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84637.log
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56204.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60456.log
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56204.log.gz
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67027.log
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84637.log.gz
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60456.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36056.log
2026-03-23T18:34:58.173 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60456.log.gz
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67027.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41464.log
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67027.log.gz
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83987.log
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36056.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36056.log.gz
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41464.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88992.log
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41464.log.gz
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83987.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26591.log
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83987.log.gz
2026-03-23T18:34:58.174 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89616.log
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88992.log.gz
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26591.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28873.log
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26591.log.gz
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89616.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79592.log
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89616.log.gz
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62090.log
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28873.log: /var/log/ceph/ceph-client.admin.79592.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36600.log
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79592.log.gz
2026-03-23T18:34:58.175 INFO:teuthology.orchestra.run.vm04.stderr: 5.3% -- replaced with /var/log/ceph/ceph-client.admin.28873.log.gz
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62090.log.gz
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90427.log
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36600.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47334.log
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36600.log.gz
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90427.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53546.log
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90427.log.gz
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47334.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88598.log
2026-03-23T18:34:58.176 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47334.log.gz
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37110.log
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53546.log.gz/var/log/ceph/ceph-client.admin.88598.log:
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41163.log
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88598.log.gz
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37110.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76499.log
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37110.log.gz
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41163.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41163.log.gzgzip
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.31851.log
2026-03-23T18:34:58.177 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88971.log
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76499.log.gz
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31851.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28091.log
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31851.log.gz
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88971.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88971.log.gz
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42140.log
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28091.log.gz
2026-03-23T18:34:58.178 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46963.log
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42140.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63260.log
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42140.log.gz
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.46963.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.87310.log
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46963.log.gz
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63260.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44168.log
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63260.log.gz
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42704.log
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87310.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87310.log.gz
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --/var/log/ceph/ceph-client.admin.44168.log: /var/log/ceph/ceph-client.admin.85067.log
2026-03-23T18:34:58.179 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44168.log.gz
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25635.log
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.42704.log.gz
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53826.log
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85067.log.gz
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37246.log
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25635.log.gz
2026-03-23T18:34:58.180 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53826.log.gz
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30235.log
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37246.log.gz
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79819.log
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30235.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30364.log
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30235.log.gz
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73452.log
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79819.log.gz
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30364.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30364.log.gz
2026-03-23T18:34:58.181 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72769.log
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73452.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33625.log
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73452.log.gz
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72769.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39698.log
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72769.log.gz
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33625.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55752.log
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39698.log.gz
2026-03-23T18:34:58.182 INFO:teuthology.orchestra.run.vm04.stderr: 56.3% -- replaced with /var/log/ceph/ceph-client.admin.33625.log.gz
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90346.log
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55752.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58732.log
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55752.log.gz
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90346.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27180.log
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90346.log.gz
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58732.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36617.log
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58732.log.gz
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27180.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.27180.log.gz
2026-03-23T18:34:58.183 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.72193.log
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36617.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75413.log
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36617.log.gz
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.72193.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.62503.log
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72193.log.gz
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75413.log.gz
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33419.log
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62503.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45237.log
2026-03-23T18:34:58.184 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62503.log.gz
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73843.log
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33419.log.gz
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45237.log.gz
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87724.log
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79141.log
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73843.log.gz
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.87724.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.35497.log
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87724.log.gz
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79141.log.gz
2026-03-23T18:34:58.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55902.log
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35497.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86775.log
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.35497.log.gz
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57907.log
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55902.log.gz
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86775.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86775.log.gz
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57843.log
2026-03-23T18:34:58.186 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83254.log
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57843.log: /var/log/ceph/ceph-client.admin.57907.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72416.log
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57843.log.gz
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57907.log.gz
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83254.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64374.log
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83254.log.gz
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72416.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32490.log
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72416.log.gz
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.64374.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.70624.log
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64374.log.gz
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75289.log
2026-03-23T18:34:58.187 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32490.log.gz
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70624.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.70624.log.gz -5
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.82462.log
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38610.log
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75289.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75289.log.gz
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61187.log
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82462.log.gz
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38610.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76800.log
2026-03-23T18:34:58.188 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38610.log.gz
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61187.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61187.log.gz
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45992.log
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76800.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35056.log
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76800.log.gz
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35182.log
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45992.log.gz
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82527.log
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35056.log: /var/log/ceph/ceph-client.admin.35182.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66321.log
2026-03-23T18:34:58.189 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35182.log.gz
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.35056.log.gz
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82527.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29870.log
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82527.log.gz
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66321.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.27983.log -- replaced with /var/log/ceph/ceph-client.admin.66321.log.gz
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47076.log
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29870.log.gz
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56355.log
2026-03-23T18:34:58.190 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27983.log: /var/log/ceph/ceph-client.admin.47076.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55537.log
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.27983.log.gz
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.47076.log.gz
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56355.log.gz
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32507.log
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66049.log
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55537.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55537.log.gz
2026-03-23T18:34:58.191 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32960.log
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32507.log.gz
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66049.log.gz
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82113.log
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32960.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36770.log
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32960.log.gz
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82113.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76843.log
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82113.log.gz
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36770.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36770.log.gz
2026-03-23T18:34:58.192 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66941.log
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76067.log
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76843.log.gz
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66941.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87359.log
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66941.log.gz
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76067.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76067.log.gz -5
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.34867.log
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83197.log
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87359.log.gz
2026-03-23T18:34:58.193 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27360.log
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34867.log: /var/log/ceph/ceph-client.admin.83197.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.34867.log.gz
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83197.log.gz
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51497.log
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.27360.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.64511.log
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27360.log.gz
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.51497.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.45645.log
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51497.log.gz
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64511.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40128.log
2026-03-23T18:34:58.194 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64511.log.gz
2026-03-23T18:34:58.195 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45645.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45645.log.gz
2026-03-23T18:34:58.195 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62452.log
2026-03-23T18:34:58.195 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40128.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69869.log
2026-03-23T18:34:58.195 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40128.log.gz
2026-03-23T18:34:58.195 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35077.log
2026-03-23T18:34:58.195 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62452.log.gz
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69869.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82570.log
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69869.log.gz
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35077.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47614.log
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.35077.log.gz
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82570.log.gz
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76410.log
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47614.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73791.log
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47614.log.gz
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76410.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75809.log
2026-03-23T18:34:58.196 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76410.log.gz
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68694.log
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73791.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73791.log.gz
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --/var/log/ceph/ceph-client.admin.75809.log: /var/log/ceph/ceph-client.admin.38547.log
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75809.log.gz
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68694.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40558.log
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68694.log.gz
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.38547.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.65241.log
2026-03-23T18:34:58.197 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40558.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82656.log
2026-03-23T18:34:58.198 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40558.log.gz
2026-03-23T18:34:58.198 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.38547.log.gz
2026-03-23T18:34:58.198 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65241.log.gz
2026-03-23T18:34:58.198 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63377.log
2026-03-23T18:34:58.198 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26892.log
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82656.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49815.log
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr: 0.0%/var/log/ceph/ceph-client.admin.63377.log: -- replaced with /var/log/ceph/ceph-client.admin.82656.log.gz
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63377.log.gz
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26892.log.gz
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51605.log
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49815.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39827.log
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49815.log.gz
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51605.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91619.log
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51605.log.gz
2026-03-23T18:34:58.199 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27162.log
2026-03-23T18:34:58.199 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39827.log.gz 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91619.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.91619.log.gz 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.79609.log 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79385.log 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27162.log.gz 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79609.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89014.log 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79609.log.gz 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79385.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.79385.log.gz /var/log/ceph/ceph-client.admin.64633.log 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:58.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89014.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88928.log 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89014.log.gz 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75481.log 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64633.log.gz 
2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88928.log.gz 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45734.log 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64239.log 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75481.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75481.log.gz 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45734.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70323.log 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45734.log.gz 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64239.log.gz 2026-03-23T18:34:58.201 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62816.log 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55365.log 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70323.log.gz 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49514.log 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62816.log.gz 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55365.log: gzip -5 --verbose 0.0% -- 
/var/log/ceph/ceph-client.admin.69989.log -- replaced with /var/log/ceph/ceph-client.admin.55365.log.gz 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49514.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45667.log 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49514.log.gz 2026-03-23T18:34:58.202 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59545.log 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69989.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69989.log.gz 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45667.log.gz 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51712.log 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59545.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84895.log 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59545.log.gz 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69499.log 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51712.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51712.log.gz 2026-03-23T18:34:58.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84895.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.77635.log -- replaced with /var/log/ceph/ceph-client.admin.84895.log.gz 2026-03-23T18:34:58.203 
INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69499.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68329.log 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69499.log.gz 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77635.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84702.log 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77635.log.gz 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68329.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26042.log 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68329.log.gz 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84702.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59394.log 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84702.log.gz 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44893.log 2026-03-23T18:34:58.204 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26042.log.gz 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59394.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.59394.log.gz -5 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.48255.log 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.87011.log 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44893.log.gz 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose/var/log/ceph/ceph-client.admin.48255.log: -- /var/log/ceph/ceph-client.admin.25866.log 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48255.log.gz 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87011.log.gz 2026-03-23T18:34:58.205 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33487.log 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25866.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33215.log 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25866.log.gz 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61595.log 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33487.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33487.log.gz 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33215.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81395.log 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33215.log.gz 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61595.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55242.log 2026-03-23T18:34:58.206 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.61595.log.gz 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81395.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81395.log.gz 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79106.log 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55242.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53869.log 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55242.log.gz 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79106.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32722.log 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79106.log.gz 2026-03-23T18:34:58.207 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37399.log 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53869.log.gz/var/log/ceph/ceph-client.admin.32722.log: 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32722.log.gz 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43028.log 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37399.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27664.log 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37399.log.gz 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56290.log 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43028.log.gz 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27664.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72609.log 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27664.log.gz 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29140.log 2026-03-23T18:34:58.208 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56290.log.gz 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26526.log 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72609.log.gz 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29140.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79733.log 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.29140.log.gz 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26526.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65888.log 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26526.log.gz 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89272.log 2026-03-23T18:34:58.209 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79733.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.79733.log.gz 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65888.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90539.log 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65888.log.gz 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89272.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46271.log 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89272.log.gz 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90539.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90539.log.gz 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65064.log 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46271.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73896.log 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46271.log.gz 2026-03-23T18:34:58.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65064.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65064.log.gz -5 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.72011.log 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73896.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.73896.log.gz 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.57950.log 2026-03-23T18:34:58.211 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72011.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66812.log 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72011.log.gz 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57950.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70216.log 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57950.log.gz 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66812.log.gz 2026-03-23T18:34:58.211 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61380.log 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73623.log 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70216.log.gz 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61380.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37008.log 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61380.log.gz 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73623.log.gz 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56398.log 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37008.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72456.log 
2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37008.log.gz 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46810.log 2026-03-23T18:34:58.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56398.log.gz 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72456.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68307.log 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.72456.log.gz 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79328.log 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46810.log.gz 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68307.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43340.log 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68307.log.gz 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79328.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.79328.log.gz -5 2026-03-23T18:34:58.213 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.48611.log 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43340.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41743.log 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43340.log.gz 2026-03-23T18:34:58.214 
INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45884.log 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48611.log.gz 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41743.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.41743.log.gz 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.89551.log 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45884.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48861.log 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79904.log 2026-03-23T18:34:58.214 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.45884.log.gz 2026-03-23T18:34:58.215 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48861.log.gz 2026-03-23T18:34:58.215 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89551.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89551.log.gz 2026-03-23T18:34:58.215 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26612.log 2026-03-23T18:34:58.215 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79904.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59416.log 2026-03-23T18:34:58.215 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79904.log.gz 2026-03-23T18:34:58.215 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54281.log 2026-03-23T18:34:58.216 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26612.log.gz 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59416.log.gz 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70860.log 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54281.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66303.log 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54281.log.gz 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39591.log 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70860.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70860.log.gz 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66303.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27380.log 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66303.log.gz 2026-03-23T18:34:58.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39591.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67963.log 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63398.log 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27380.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.39591.log.gz 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27380.log.gz 
2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67963.log.gz 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.1.log 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63398.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39381.log 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63398.log.gz 2026-03-23T18:34:58.217 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71501.log 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-osd.1.log: /var/log/ceph/ceph-client.admin.39381.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47571.log 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39381.log.gz 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71501.log.gz 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86192.log 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47571.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44592.log 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47571.log.gz 2026-03-23T18:34:58.218 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86192.log.gz 2026-03-23T18:34:58.219 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85827.log 
2026-03-23T18:34:58.219 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44592.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33504.log
2026-03-23T18:34:58.219 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44592.log.gz
2026-03-23T18:34:58.219 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85827.log.gz
2026-03-23T18:34:58.219 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66468.log
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33504.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49643.log
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33504.log.gz
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66468.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58545.log
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66468.log.gz
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49643.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49643.log.gz
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50734.log
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58545.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56161.log
2026-03-23T18:34:58.220 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58545.log.gz
2026-03-23T18:34:58.221 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50734.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57542.log
2026-03-23T18:34:58.221 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50734.log.gz
2026-03-23T18:34:58.221 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56161.log.gz
2026-03-23T18:34:58.221 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44389.log
2026-03-23T18:34:58.221 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57542.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51239.log
2026-03-23T18:34:58.221 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57542.log.gz
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44389.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62604.log
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.44389.log.gz
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51239.log.gz
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78043.log
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62604.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89745.log
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62604.log.gz
2026-03-23T18:34:58.222 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78043.log.gz
2026-03-23T18:34:58.223 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79802.log
2026-03-23T18:34:58.223 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89745.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32592.log
2026-03-23T18:34:58.223 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89745.log.gz
2026-03-23T18:34:58.223 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79802.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57714.log
2026-03-23T18:34:58.223 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79802.log.gz
2026-03-23T18:34:58.223 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32592.log.gz
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29376.log
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57714.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81811.log
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57714.log.gz
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29376.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40658.log
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29376.log.gz
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81811.log.gz
2026-03-23T18:34:58.224 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84573.log
2026-03-23T18:34:58.225 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40658.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56763.log
2026-03-23T18:34:58.225 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40658.log.gz
2026-03-23T18:34:58.225 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84573.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57477.log
2026-03-23T18:34:58.225 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84573.log.gz
2026-03-23T18:34:58.225 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56763.log.gz
2026-03-23T18:34:58.225 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40797.log
2026-03-23T18:34:58.226 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57477.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81922.log
2026-03-23T18:34:58.226 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57477.log.gz
2026-03-23T18:34:58.226 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40797.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27853.log
2026-03-23T18:34:58.226 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40797.log.gz
2026-03-23T18:34:58.226 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81922.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81922.log.gz
2026-03-23T18:34:58.226 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34909.log
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27853.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59158.log
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27853.log.gz
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34909.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22700.log
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.34909.log.gz
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59158.log.gz
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45906.log
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.22700.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88799.log
2026-03-23T18:34:58.227 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22700.log.gz
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45906.log.gz
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77936.log
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88799.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66661.log
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88799.log.gz
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77936.log.gz
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39763.log
2026-03-23T18:34:58.228 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66661.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90132.log
2026-03-23T18:34:58.229 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66661.log.gz
2026-03-23T18:34:58.229 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39763.log.gz
2026-03-23T18:34:58.229 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47678.log
2026-03-23T18:34:58.229 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90132.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68544.log
2026-03-23T18:34:58.229 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90132.log.gz
2026-03-23T18:34:58.229 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47678.log.gz
2026-03-23T18:34:58.230 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83231.log
2026-03-23T18:34:58.230 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68544.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48060.log
2026-03-23T18:34:58.230 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68544.log.gz
2026-03-23T18:34:58.230 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83231.log.gz
2026-03-23T18:34:58.230 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29698.log
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48060.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46336.log
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48060.log.gz
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29698.log.gz
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35665.log
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46336.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66876.log
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46336.log.gz
2026-03-23T18:34:58.231 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35665.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35665.log.gz
2026-03-23T18:34:58.232 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31829.log
2026-03-23T18:34:58.232 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66876.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55945.log
2026-03-23T18:34:58.232 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66876.log.gz
2026-03-23T18:34:58.232 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31829.log.gz
2026-03-23T18:34:58.232 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53718.log
2026-03-23T18:34:58.232 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55945.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75017.log
2026-03-23T18:34:58.233 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55945.log.gz
2026-03-23T18:34:58.233 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53718.log.gz
2026-03-23T18:34:58.233 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31786.log
2026-03-23T18:34:58.233 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75017.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73516.log
2026-03-23T18:34:58.233 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75017.log.gz
2026-03-23T18:34:58.233 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31786.log.gz
2026-03-23T18:34:58.234 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81460.log
2026-03-23T18:34:58.234 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73516.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84917.log
2026-03-23T18:34:58.234 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73516.log.gz
2026-03-23T18:34:58.234 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81460.log.gz
2026-03-23T18:34:58.234 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34762.log
2026-03-23T18:34:58.234 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84917.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40522.log
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84917.log.gz
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34762.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57778.log
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr: 26.9% -- replaced with /var/log/ceph/ceph-client.admin.34762.log.gz
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40522.log.gz
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48038.log
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57778.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71880.log
2026-03-23T18:34:58.235 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57778.log.gz
2026-03-23T18:34:58.236 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48038.log.gz
2026-03-23T18:34:58.236 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68479.log
2026-03-23T18:34:58.236 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71880.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27758.log
2026-03-23T18:34:58.236 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71880.log.gz
2026-03-23T18:34:58.236 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68479.log.gz
2026-03-23T18:34:58.236 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64551.log
2026-03-23T18:34:58.237 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27758.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75602.log
2026-03-23T18:34:58.237 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27758.log.gz
2026-03-23T18:34:58.237 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64551.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64551.log.gz
2026-03-23T18:34:58.237 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58630.log
2026-03-23T18:34:58.237 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75602.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88526.log
2026-03-23T18:34:58.237 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75602.log.gz
2026-03-23T18:34:58.238 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58630.log.gz
2026-03-23T18:34:58.238 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76865.log
2026-03-23T18:34:58.238 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88526.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85956.log
2026-03-23T18:34:58.238 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88526.log.gz
2026-03-23T18:34:58.238 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76865.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76865.log.gz
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82613.log
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85956.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44713.log
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85956.log.gz
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82613.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.82613.log.gz
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.77613.log
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44713.log.gz
2026-03-23T18:34:58.239 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32422.log
2026-03-23T18:34:58.240 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77613.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22901.log
2026-03-23T18:34:58.240 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77613.log.gz
2026-03-23T18:34:58.240 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32422.log.gz
2026-03-23T18:34:58.240 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84981.log
2026-03-23T18:34:58.240 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.22901.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53740.log
2026-03-23T18:34:58.240 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22901.log.gz
2026-03-23T18:34:58.241 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84981.log.gz
2026-03-23T18:34:58.241 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60702.log
2026-03-23T18:34:58.241 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53740.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57204.log
2026-03-23T18:34:58.241 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53740.log.gz
2026-03-23T18:34:58.241 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60702.log.gz
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71329.log
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57204.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56613.log
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57204.log.gz
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71329.log.gz
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50482.log
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56613.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91671.log
2026-03-23T18:34:58.242 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56613.log.gz
2026-03-23T18:34:58.243 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50482.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84809.log
2026-03-23T18:34:58.243 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50482.log.gz
2026-03-23T18:34:58.243 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91671.log.gz
2026-03-23T18:34:58.243 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34573.log
2026-03-23T18:34:58.243 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84809.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79520.log
2026-03-23T18:34:58.243 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84809.log.gz
2026-03-23T18:34:58.244 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34573.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63551.log
2026-03-23T18:34:58.244 INFO:teuthology.orchestra.run.vm04.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.34573.log.gz
2026-03-23T18:34:58.244 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74158.log
2026-03-23T18:34:58.244 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79520.log: /var/log/ceph/ceph-client.admin.63551.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79520.log.gz
2026-03-23T18:34:58.244 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63551.log.gz
2026-03-23T18:34:58.244 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37703.log
2026-03-23T18:34:58.245 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74158.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29032.log
2026-03-23T18:34:58.245 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74158.log.gz
2026-03-23T18:34:58.245 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37703.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.37703.log.gz
2026-03-23T18:34:58.245 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29848.log
2026-03-23T18:34:58.245 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29032.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71953.log
2026-03-23T18:34:58.245 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29032.log.gz
2026-03-23T18:34:58.246 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29848.log.gz
2026-03-23T18:34:58.246 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88842.log
2026-03-23T18:34:58.246 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71953.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41528.log
2026-03-23T18:34:58.246 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71953.log.gz
2026-03-23T18:34:58.246 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88842.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88842.log.gz
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64278.log
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41528.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71479.log
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41528.log.gz
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64278.log.gz
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45129.log
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71479.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68909.log
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71479.log.gz
2026-03-23T18:34:58.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45129.log.gz
2026-03-23T18:34:58.248 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71522.log
2026-03-23T18:34:58.248 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68909.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72979.log
2026-03-23T18:34:58.248 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68909.log.gz
2026-03-23T18:34:58.248 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71522.log.gz
2026-03-23T18:34:58.248 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50986.log
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72979.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50654.log
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72979.log.gz
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50986.log.gz
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49772.log
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50654.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84009.log
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50654.log.gz
2026-03-23T18:34:58.249 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49772.log.gz
2026-03-23T18:34:58.250 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81675.log
2026-03-23T18:34:58.250 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84009.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49449.log
2026-03-23T18:34:58.250 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84009.log.gz
2026-03-23T18:34:58.250 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81675.log.gz
2026-03-23T18:34:58.251 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50332.log
2026-03-23T18:34:58.251 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49449.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81137.log
2026-03-23T18:34:58.251 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50332.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.49449.log.gz
2026-03-23T18:34:58.251 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50332.log.gz
2026-03-23T18:34:58.251 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49966.log
2026-03-23T18:34:58.251 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81137.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89422.log
2026-03-23T18:34:58.252 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81137.log.gz
2026-03-23T18:34:58.252 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49966.log.gz
2026-03-23T18:34:58.252 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48736.log
2026-03-23T18:34:58.252 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91463.log
2026-03-23T18:34:58.252 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89422.log.gz
2026-03-23T18:34:58.252 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48736.log.gz
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85590.log
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91463.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66425.log
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91463.log.gz
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85590.log.gz
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43750.log
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66425.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79194.log
2026-03-23T18:34:58.253 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66425.log.gz
2026-03-23T18:34:58.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43750.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34321.log
2026-03-23T18:34:58.254 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.43750.log.gz
2026-03-23T18:34:58.254 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44088.log
2026-03-23T18:34:58.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79194.log.gz
2026-03-23T18:34:58.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34321.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.34321.log.gz
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26870.log
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44088.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84095.log
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26870.log: 0.0% 26.0% -- replaced with /var/log/ceph/ceph-client.admin.26870.log.gz
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.44088.log.gz
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29097.log
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84095.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31658.log
2026-03-23T18:34:58.255 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84095.log.gz
2026-03-23T18:34:58.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29097.log.gz
2026-03-23T18:34:58.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40837.log
2026-03-23T18:34:58.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31658.log.gz
2026-03-23T18:34:58.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59762.log
2026-03-23T18:34:58.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40837.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76239.log
2026-03-23T18:34:58.256 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40837.log.gz
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59762.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59762.log.gz
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87896.log
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76239.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48902.log
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76239.log.gz
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87896.log.gz
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31336.log
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59480.log
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48902.log.gz
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31336.log.gz
2026-03-23T18:34:58.257 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85183.log
2026-03-23T18:34:58.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59480.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46706.log
2026-03-23T18:34:58.258 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59480.log.gz
2026-03-23T18:34:58.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85183.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82742.log
2026-03-23T18:34:58.258 INFO:teuthology.orchestra.run.vm04.stderr: 90.8% -- replaced with /var/log/ceph/ceph-client.admin.85183.log.gz
2026-03-23T18:34:58.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46706.log.gz
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70151.log
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82742.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59920.log
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82742.log.gz
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70151.log.gz
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80275.log
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59920.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89594.log
2026-03-23T18:34:58.259 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59920.log.gz
2026-03-23T18:34:58.260 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80275.log.gz
2026-03-23T18:34:58.260 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85891.log
2026-03-23T18:34:58.260 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89594.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27418.log
2026-03-23T18:34:58.260 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89594.log.gz
2026-03-23T18:34:58.260 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85891.log.gz
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47977.log
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27418.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65497.log
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27418.log.gz
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47977.log.gz
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47184.log
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65497.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73913.log
2026-03-23T18:34:58.261 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65497.log.gz
2026-03-23T18:34:58.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47184.log.gz
2026-03-23T18:34:58.262 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89401.log
2026-03-23T18:34:58.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73913.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80036.log
2026-03-23T18:34:58.262 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73913.log.gz
2026-03-23T18:34:58.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89401.log: 0.0% --
replaced with /var/log/ceph/ceph-client.admin.89401.log.gz 2026-03-23T18:34:58.263 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69189.log 2026-03-23T18:34:58.263 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80036.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81223.log 2026-03-23T18:34:58.263 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80036.log.gz 2026-03-23T18:34:58.263 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69189.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69189.log.gz 2026-03-23T18:34:58.263 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33572.log 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81223.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65557.log 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81223.log.gz 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33572.log.gz 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27263.log 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65557.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75515.log 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65557.log.gz 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27263.log.gz 2026-03-23T18:34:58.264 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.38526.log 2026-03-23T18:34:58.265 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75515.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36889.log 2026-03-23T18:34:58.265 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75515.log.gz 2026-03-23T18:34:58.265 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38526.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37433.log 2026-03-23T18:34:58.265 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38526.log.gz 2026-03-23T18:34:58.265 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36889.log.gz 2026-03-23T18:34:58.265 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36073.log 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37433.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77957.log 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37433.log.gz 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36073.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36073.log.gz 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83557.log 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77957.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28650.log 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77957.log.gz 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83557.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.83557.log.gz 2026-03-23T18:34:58.266 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50439.log 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28650.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37871.log 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28650.log.gz 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50439.log.gz 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71436.log 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59843.log 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37871.log.gz 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71436.log.gz 2026-03-23T18:34:58.267 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50633.log 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59843.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60139.log 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59843.log.gz 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50633.log.gz 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.54041.log 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60139.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27570.log 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60139.log.gz 2026-03-23T18:34:58.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54041.log.gz 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70839.log 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27570.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74847.log 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27570.log.gz 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70839.log.gz 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51347.log 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74847.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71863.log 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74847.log.gz 2026-03-23T18:34:58.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51347.log.gz 2026-03-23T18:34:58.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89702.log 2026-03-23T18:34:58.270 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71863.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.51906.log 2026-03-23T18:34:58.270 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71863.log.gz 2026-03-23T18:34:58.270 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89702.log.gz 2026-03-23T18:34:58.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67420.log 2026-03-23T18:34:58.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51906.log.gz 2026-03-23T18:34:58.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61036.log 2026-03-23T18:34:58.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67420.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67420.log.gz 2026-03-23T18:34:58.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22838.log 2026-03-23T18:34:58.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61036.log.gz 2026-03-23T18:34:58.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88949.log 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.22838.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62735.log 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22838.log.gz 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88949.log.gz 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.59707.log 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62735.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73237.log 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62735.log.gz 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59707.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59707.log.gz 2026-03-23T18:34:58.272 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59609.log 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73237.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68501.log 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73237.log.gz 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59609.log.gz 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72582.log 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68501.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59631.log 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68501.log.gz 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72582.log.gz 2026-03-23T18:34:58.273 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85741.log 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59631.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.66704.log 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59631.log.gz 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85741.log.gz 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48276.log 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30947.log 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66704.log.gz 2026-03-23T18:34:58.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48276.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48276.log.gz 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84616.log 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30947.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60497.log 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.30947.log.gz 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84616.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84616.log.gz 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65301.log 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60497.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69436.log 2026-03-23T18:34:58.275 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60497.log.gz 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65301.log.gz 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59502.log 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69436.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37416.log 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69436.log.gz 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59502.log.gz 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79539.log 2026-03-23T18:34:58.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37416.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84266.log 2026-03-23T18:34:58.277 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37416.log.gz 2026-03-23T18:34:58.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79539.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79539.log.gz 2026-03-23T18:34:58.277 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37059.log 2026-03-23T18:34:58.277 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54685.log 2026-03-23T18:34:58.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84266.log.gz 2026-03-23T18:34:58.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37059.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.37059.log.gz 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42926.log 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54685.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72395.log 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54685.log.gz 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42926.log.gz 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91489.log 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72395.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37661.log 2026-03-23T18:34:58.278 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.72395.log.gz 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91489.log.gz 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86431.log 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37661.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42243.log 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37661.log.gz 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86431.log.gz 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.41038.log 2026-03-23T18:34:58.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42243.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55515.log 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41038.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76282.log 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41038.log.gz 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42243.log.gz 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82763.log 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55515.log: /var/log/ceph/ceph-client.admin.76282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76282.log.gz 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.55515.log.gz 2026-03-23T18:34:58.280 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78577.log 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82763.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56376.log 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82763.log.gz 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78577.log.gz 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58834.log 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56376.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.31271.log 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56376.log.gz 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58834.log.gz 2026-03-23T18:34:58.281 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37348.log 2026-03-23T18:34:58.282 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31271.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43989.log 2026-03-23T18:34:58.282 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31271.log.gz 2026-03-23T18:34:58.282 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37348.log.gz 2026-03-23T18:34:58.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60102.log 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43989.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43989.log.gz 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43692.log 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60102.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83180.log 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.60102.log.gz 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43692.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26440.log 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.43692.log.gz 2026-03-23T18:34:58.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83180.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83180.log.gz 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82131.log 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26440.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43911.log 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26440.log.gz 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82131.log.gz 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49004.log 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43911.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35784.log 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43911.log.gz 2026-03-23T18:34:58.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49004.log.gz 2026-03-23T18:34:58.285 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74890.log 2026-03-23T18:34:58.285 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35784.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32047.log 2026-03-23T18:34:58.285 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35784.log.gz 2026-03-23T18:34:58.285 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74890.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.74890.log.gz 2026-03-23T18:34:58.285 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88906.log 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32047.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27778.log 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32047.log.gz 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88906.log.gz 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56591.log 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27778.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49148.log 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27778.log.gz 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56591.log.gz 2026-03-23T18:34:58.286 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87334.log 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49148.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56677.log 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87334.log.gz 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.49148.log.gz 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.27474.log 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56677.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82591.log 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56677.log.gz 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27474.log.gz 2026-03-23T18:34:58.287 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79011.log 2026-03-23T18:34:58.288 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82591.log.gz 2026-03-23T18:34:58.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36957.log 2026-03-23T18:34:58.288 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79011.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32977.log 2026-03-23T18:34:58.288 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79011.log.gz 2026-03-23T18:34:58.288 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36957.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36957.log.gz 2026-03-23T18:34:58.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88574.log 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68286.log 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32977.log.gz 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88574.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.88574.log.gz 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34300.log 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68286.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71951.log 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68286.log.gz 2026-03-23T18:34:58.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34300.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60085.log 2026-03-23T18:34:58.290 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.34300.log.gz 2026-03-23T18:34:58.290 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71951.log.gz 2026-03-23T18:34:58.290 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89960.log 2026-03-23T18:34:58.290 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60085.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43771.log 2026-03-23T18:34:58.290 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60085.log.gz 2026-03-23T18:34:58.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89960.log.gz 2026-03-23T18:34:58.291 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68587.log 2026-03-23T18:34:58.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43771.log.gz 2026-03-23T18:34:58.291 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.49234.log 2026-03-23T18:34:58.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68587.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82634.log 2026-03-23T18:34:58.291 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68587.log.gz 2026-03-23T18:34:58.292 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49234.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.49234.log.gz 2026-03-23T18:34:58.292 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42078.log 2026-03-23T18:34:58.292 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82634.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26290.log 2026-03-23T18:34:58.292 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82634.log.gz 2026-03-23T18:34:58.292 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42078.log.gz 2026-03-23T18:34:58.293 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31059.log 2026-03-23T18:34:58.293 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78559.log 2026-03-23T18:34:58.293 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26290.log.gz 2026-03-23T18:34:58.293 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31059.log.gz 2026-03-23T18:34:58.293 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31572.log 2026-03-23T18:34:58.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78559.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.89078.log 2026-03-23T18:34:58.294 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78559.log.gz 2026-03-23T18:34:58.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31572.log.gz 2026-03-23T18:34:58.294 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42844.log 2026-03-23T18:34:58.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89078.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65362.log 2026-03-23T18:34:58.294 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89078.log.gz 2026-03-23T18:34:58.295 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42844.log: 29.4% -- replaced with /var/log/ceph/ceph-client.admin.42844.log.gz 2026-03-23T18:34:58.295 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43423.log 2026-03-23T18:34:58.295 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65362.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32807.log 2026-03-23T18:34:58.295 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65362.log.gz 2026-03-23T18:34:58.295 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43423.log.gz 2026-03-23T18:34:58.295 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58549.log 2026-03-23T18:34:58.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32807.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58443.log 2026-03-23T18:34:58.296 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.32807.log.gz 2026-03-23T18:34:58.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58549.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58549.log.gz 2026-03-23T18:34:58.296 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26913.log 2026-03-23T18:34:58.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58443.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71897.log 2026-03-23T18:34:58.296 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58443.log.gz 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26913.log.gz 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57864.log 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71897.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64815.log 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71897.log.gz 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57864.log.gz 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32132.log 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64815.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83600.log 2026-03-23T18:34:58.297 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64815.log.gz 2026-03-23T18:34:58.298 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32132.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.32132.log.gz 2026-03-23T18:34:58.298 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81589.log 2026-03-23T18:34:58.298 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83600.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54955.log 2026-03-23T18:34:58.298 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83600.log.gz 2026-03-23T18:34:58.298 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81589.log.gz 2026-03-23T18:34:58.298 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56656.log 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58122.log 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54955.log.gz 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56656.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56656.log.gz 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53568.log 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58122.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43524.log 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58122.log.gz 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53568.log.gz 2026-03-23T18:34:58.299 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.81828.log 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43524.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83880.log 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr: 56.7% -- replaced with /var/log/ceph/ceph-client.admin.43524.log.gz 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81828.log.gz 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87273.log 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83880.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46670.log 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83880.log.gz 2026-03-23T18:34:58.300 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87273.log.gz 2026-03-23T18:34:58.301 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80404.log 2026-03-23T18:34:58.301 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46670.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45151.log 2026-03-23T18:34:58.301 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46670.log.gz 2026-03-23T18:34:58.301 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80404.log.gz 2026-03-23T18:34:58.301 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64774.log 2026-03-23T18:34:58.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45151.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.80447.log 2026-03-23T18:34:58.302 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45151.log.gz 2026-03-23T18:34:58.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64774.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46776.log 2026-03-23T18:34:58.302 INFO:teuthology.orchestra.run.vm04.stderr: 53.0% -- replaced with /var/log/ceph/ceph-client.admin.64774.log.gz 2026-03-23T18:34:58.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80447.log.gz 2026-03-23T18:34:58.302 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90261.log 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46776.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40618.log 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46776.log.gz 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90261.log.gz 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58324.log 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40618.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87681.log 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40618.log.gz 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58324.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58324.log.gz 2026-03-23T18:34:58.303 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.45215.log 2026-03-23T18:34:58.304 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87681.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50095.log 2026-03-23T18:34:58.304 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87681.log.gz 2026-03-23T18:34:58.304 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45215.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45215.log.gz 2026-03-23T18:34:58.304 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33538.log 2026-03-23T18:34:58.304 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50095.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34111.log 2026-03-23T18:34:58.304 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50095.log.gz 2026-03-23T18:34:58.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33538.log.gz 2026-03-23T18:34:58.305 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27834.log 2026-03-23T18:34:58.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34111.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32218.log 2026-03-23T18:34:58.305 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34111.log.gz 2026-03-23T18:34:58.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27834.log.gz 2026-03-23T18:34:58.305 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26827.log 2026-03-23T18:34:58.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32218.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.58690.log 2026-03-23T18:34:58.306 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32218.log.gz 2026-03-23T18:34:58.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26827.log.gz 2026-03-23T18:34:58.306 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33754.log 2026-03-23T18:34:58.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58690.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37025.log 2026-03-23T18:34:58.306 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58690.log.gz 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33754.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39486.log 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37025.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.33754.log.gz 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr: 29.0% -- replaced with /var/log/ceph/ceph-client.admin.37025.log.gz 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51433.log 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39486.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46652.log 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.39486.log.gz 2026-03-23T18:34:58.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51433.log.gz 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71694.log 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46652.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42784.log 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46652.log.gz 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71694.log.gz 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75981.log 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42784.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65476.log 2026-03-23T18:34:58.308 INFO:teuthology.orchestra.run.vm04.stderr: 15.1% -- replaced with /var/log/ceph/ceph-client.admin.42784.log.gz 2026-03-23T18:34:58.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75981.log.gz 2026-03-23T18:34:58.309 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83686.log 2026-03-23T18:34:58.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65476.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56441.log 2026-03-23T18:34:58.309 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65476.log.gz 2026-03-23T18:34:58.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83686.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83686.log.gz 2026-03-23T18:34:58.310 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40757.log 2026-03-23T18:34:58.310 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56441.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.77187.log 2026-03-23T18:34:58.310 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56441.log.gz 2026-03-23T18:34:58.310 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40757.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.40757.log.gz 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40636.log 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77187.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34090.log 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77187.log.gz 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40636.log.gz 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45086.log 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34090.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91437.log 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45086.log.gz 2026-03-23T18:34:58.311 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.34090.log.gz 2026-03-23T18:34:58.312 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38946.log 2026-03-23T18:34:58.312 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91437.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83772.log 2026-03-23T18:34:58.312 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.91437.log.gz 2026-03-23T18:34:58.312 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38946.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38946.log.gz 2026-03-23T18:34:58.312 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50138.log 2026-03-23T18:34:58.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83772.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82441.log 2026-03-23T18:34:58.313 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83772.log.gz 2026-03-23T18:34:58.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50138.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50138.log.gz 2026-03-23T18:34:58.313 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50525.log 2026-03-23T18:34:58.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82441.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33130.log 2026-03-23T18:34:58.313 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82441.log.gz 2026-03-23T18:34:58.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50525.log.gz 2026-03-23T18:34:58.314 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73723.log 2026-03-23T18:34:58.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33130.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50418.log 2026-03-23T18:34:58.314 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33130.log.gz 2026-03-23T18:34:58.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73723.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.73723.log.gz 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30886.log 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50418.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48820.log 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50418.log.gz 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61122.log 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30886.log.gz 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48820.log.gz 2026-03-23T18:34:58.315 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49492.log 2026-03-23T18:34:58.316 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61122.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48172.log 2026-03-23T18:34:58.316 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61122.log.gz 2026-03-23T18:34:58.316 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49492.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.49492.log.gz 2026-03-23T18:34:58.316 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35728.log 2026-03-23T18:34:58.316 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48172.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58921.log 2026-03-23T18:34:58.316 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.48172.log.gz 2026-03-23T18:34:58.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35728.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.35728.log.gz 2026-03-23T18:34:58.317 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79501.log 2026-03-23T18:34:58.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58921.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90372.log 2026-03-23T18:34:58.317 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58921.log.gz 2026-03-23T18:34:58.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79501.log.gz 2026-03-23T18:34:58.318 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47356.log 2026-03-23T18:34:58.318 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90372.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42343.log 2026-03-23T18:34:58.318 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90372.log.gz 2026-03-23T18:34:58.318 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47356.log.gz 2026-03-23T18:34:58.318 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29805.log 2026-03-23T18:34:58.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42343.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29934.log 2026-03-23T18:34:58.319 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42343.log.gz 2026-03-23T18:34:58.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29805.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.29805.log.gz 2026-03-23T18:34:58.319 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26569.log 2026-03-23T18:34:58.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29934.log.gz 2026-03-23T18:34:58.319 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29072.log 2026-03-23T18:34:58.320 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26569.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88502.log 2026-03-23T18:34:58.320 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26569.log.gz 2026-03-23T18:34:58.320 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29072.log.gz 2026-03-23T18:34:58.320 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56806.log 2026-03-23T18:34:58.320 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88502.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45624.log 2026-03-23T18:34:58.320 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88502.log.gz 2026-03-23T18:34:58.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56806.log.gz 2026-03-23T18:34:58.321 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47119.log 2026-03-23T18:34:58.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45624.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29012.log 2026-03-23T18:34:58.321 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.45624.log.gz 2026-03-23T18:34:58.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47119.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47119.log.gz 2026-03-23T18:34:58.321 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72256.log 2026-03-23T18:34:58.322 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29012.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36651.log 2026-03-23T18:34:58.322 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29012.log.gz 2026-03-23T18:34:58.322 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72256.log: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.72256.log.gz 2026-03-23T18:34:58.322 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70904.log 2026-03-23T18:34:58.322 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36651.log.gz 2026-03-23T18:34:58.323 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47635.log 2026-03-23T18:34:58.323 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70904.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86410.log 2026-03-23T18:34:58.323 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70904.log.gz 2026-03-23T18:34:58.323 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47635.log.gz 2026-03-23T18:34:58.323 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40919.log 2026-03-23T18:34:58.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86410.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.77871.log 2026-03-23T18:34:58.324 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86410.log.gz 2026-03-23T18:34:58.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40919.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40919.log.gz 2026-03-23T18:34:58.324 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75585.log 2026-03-23T18:34:58.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61789.log 2026-03-23T18:34:58.324 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77871.log.gz 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75585.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33283.log 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75585.log.gz 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61789.log.gz 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69909.log 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33283.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44030.log 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33283.log.gz 2026-03-23T18:34:58.325 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69909.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36940.log 2026-03-23T18:34:58.326 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.69909.log.gz 2026-03-23T18:34:58.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44030.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.44030.log.gz 2026-03-23T18:34:58.326 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27702.log 2026-03-23T18:34:58.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36940.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32841.log 2026-03-23T18:34:58.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27702.log.gz 2026-03-23T18:34:58.327 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36940.log.gz 2026-03-23T18:34:58.327 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40445.log 2026-03-23T18:34:58.327 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32841.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75895.log 2026-03-23T18:34:58.327 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32841.log.gz 2026-03-23T18:34:58.327 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40445.log.gz 2026-03-23T18:34:58.327 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49213.log 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75895.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75895.log.gz 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75744.log 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49213.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.63451.log 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49213.log.gz 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75744.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75744.log.gz -5 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.61101.log 2026-03-23T18:34:58.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63451.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63451.log.gz 2026-03-23T18:34:58.329 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62047.log 2026-03-23T18:34:58.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61101.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31637.log 2026-03-23T18:34:58.329 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61101.log.gz 2026-03-23T18:34:58.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62047.log.gz 2026-03-23T18:34:58.329 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66769.log 2026-03-23T18:34:58.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31637.log.gz 2026-03-23T18:34:58.330 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30429.log 2026-03-23T18:34:58.330 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66769.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38925.log 2026-03-23T18:34:58.330 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.66769.log.gz 2026-03-23T18:34:58.330 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30429.log.gz 2026-03-23T18:34:58.330 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41507.log 2026-03-23T18:34:58.330 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38925.log: 27.5% -- replaced with /var/log/ceph/ceph-client.admin.38925.log.gz 2026-03-23T18:34:58.331 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44936.log 2026-03-23T18:34:58.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41507.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72494.log 2026-03-23T18:34:58.331 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41507.log.gz 2026-03-23T18:34:58.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44936.log.gz 2026-03-23T18:34:58.331 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62794.log 2026-03-23T18:34:58.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72494.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72494.log.gz 2026-03-23T18:34:58.332 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55451.log 2026-03-23T18:34:58.332 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62794.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32875.log 2026-03-23T18:34:58.332 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62794.log.gz 2026-03-23T18:34:58.332 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55451.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.55451.log.gz 2026-03-23T18:34:58.332 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84181.log 2026-03-23T18:34:58.332 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32875.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65787.log 2026-03-23T18:34:58.333 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32875.log.gz 2026-03-23T18:34:58.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84181.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84181.log.gz 2026-03-23T18:34:58.333 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72636.log 2026-03-23T18:34:58.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65787.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32994.log 2026-03-23T18:34:58.333 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65787.log.gz 2026-03-23T18:34:58.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72636.log.gz 2026-03-23T18:34:58.334 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71243.log 2026-03-23T18:34:58.334 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32994.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32405.log 2026-03-23T18:34:58.334 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32994.log.gz 2026-03-23T18:34:58.334 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71243.log.gz 2026-03-23T18:34:58.334 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69563.log 2026-03-23T18:34:58.335 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32405.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66962.log 2026-03-23T18:34:58.335 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32405.log.gz 2026-03-23T18:34:58.335 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69563.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.69563.log.gz -5 2026-03-23T18:34:58.335 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.68931.log 2026-03-23T18:34:58.335 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66962.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66962.log.gz 2026-03-23T18:34:58.335 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68630.log 2026-03-23T18:34:58.336 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68931.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31508.log 2026-03-23T18:34:58.336 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68931.log.gz 2026-03-23T18:34:58.336 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68630.log.gz 2026-03-23T18:34:58.336 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51368.log 2026-03-23T18:34:58.336 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31508.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55709.log 2026-03-23T18:34:58.336 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31508.log.gz 2026-03-23T18:34:58.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51368.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.51368.log.gz 2026-03-23T18:34:58.337 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63509.log 2026-03-23T18:34:58.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55709.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78964.log 2026-03-23T18:34:58.337 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55709.log.gz 2026-03-23T18:34:58.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63509.log.gz 2026-03-23T18:34:58.337 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40386.log 2026-03-23T18:34:58.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78964.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90467.log 2026-03-23T18:34:58.338 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78964.log.gz 2026-03-23T18:34:58.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40386.log.gz 2026-03-23T18:34:58.338 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33181.log 2026-03-23T18:34:58.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90467.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68071.log 2026-03-23T18:34:58.338 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90467.log.gz 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33181.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33181.log.gz 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71050.log 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68071.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.42016.log -- replaced with /var/log/ceph/ceph-client.admin.68071.log.gz 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71050.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89143.log 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71050.log.gz 2026-03-23T18:34:58.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42016.log.gz 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81030.log 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89143.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50052.log 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89143.log.gz 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81030.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.81030.log.gz -- 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr: /var/log/ceph/ceph-client.admin.55472.log 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50052.log.gz 2026-03-23T18:34:58.340 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64694.log 2026-03-23T18:34:58.341 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55472.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.65381.log 2026-03-23T18:34:58.341 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.55472.log.gz 2026-03-23T18:34:58.341 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64694.log: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.64694.log.gz 2026-03-23T18:34:58.341 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85569.log 2026-03-23T18:34:58.341 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65381.log: 56.3% -- replaced with /var/log/ceph/ceph-client.admin.65381.log.gz 2026-03-23T18:34:58.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48800.log 2026-03-23T18:34:58.342 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85569.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46880.log 2026-03-23T18:34:58.342 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85569.log.gz 2026-03-23T18:34:58.342 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48800.log.gz 2026-03-23T18:34:58.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61918.log 2026-03-23T18:34:58.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88192.log 2026-03-23T18:34:58.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61918.log.gz 2026-03-23T18:34:58.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46880.log.gz 2026-03-23T18:34:58.343 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56031.log 2026-03-23T18:34:58.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88192.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87488.log 2026-03-23T18:34:58.343 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88192.log.gz 2026-03-23T18:34:58.344 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56031.log.gz 2026-03-23T18:34:58.344 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33691.log 2026-03-23T18:34:58.344 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87488.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.87488.log.gz 2026-03-23T18:34:58.344 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.36923.log 2026-03-23T18:34:58.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33691.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57456.log 2026-03-23T18:34:58.345 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.33691.log.gz 2026-03-23T18:34:58.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36923.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36923.log.gz 2026-03-23T18:34:58.345 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29590.log 2026-03-23T18:34:58.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57456.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26698.log 2026-03-23T18:34:58.345 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57456.log.gz 2026-03-23T18:34:58.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29590.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.29590.log.gz 2026-03-23T18:34:58.346 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86818.log 2026-03-23T18:34:58.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26698.log.gz 2026-03-23T18:34:58.346 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42865.log 2026-03-23T18:34:58.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86818.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90389.log 2026-03-23T18:34:58.346 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86818.log.gz 2026-03-23T18:34:58.347 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42865.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42865.log.gz 2026-03-23T18:34:58.347 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54894.log 2026-03-23T18:34:58.347 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90389.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90389.log.gz 2026-03-23T18:34:58.347 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77355.log 2026-03-23T18:34:58.347 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54894.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25890.log 2026-03-23T18:34:58.348 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54894.log.gz 2026-03-23T18:34:58.348 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77355.log.gz 2026-03-23T18:34:58.348 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.46599.log 2026-03-23T18:34:58.348 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25890.log.gz 2026-03-23T18:34:58.348 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64354.log 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46599.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43729.log 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46599.log.gz 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64354.log.gz 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73430.log 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43729.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80640.log 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43729.log.gz 2026-03-23T18:34:58.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73430.log.gz 2026-03-23T18:34:58.350 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75168.log 2026-03-23T18:34:58.350 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80640.log.gz 2026-03-23T18:34:58.350 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77529.log 2026-03-23T18:34:58.351 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75168.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.75168.log.gz 2026-03-23T18:34:58.351 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77268.log 2026-03-23T18:34:58.351 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77529.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74982.log 2026-03-23T18:34:58.351 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77529.log.gz 2026-03-23T18:34:58.351 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77268.log.gz 2026-03-23T18:34:58.352 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89917.log 2026-03-23T18:34:58.352 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74982.log.gz 2026-03-23T18:34:58.352 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29827.log 2026-03-23T18:34:58.352 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89917.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55015.log 2026-03-23T18:34:58.352 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89917.log.gz 2026-03-23T18:34:58.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29827.log.gz 2026-03-23T18:34:58.353 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26247.log 2026-03-23T18:34:58.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55015.log.gz 2026-03-23T18:34:58.353 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.49793.log 2026-03-23T18:34:58.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26247.log.gz 2026-03-23T18:34:58.354 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42264.log 2026-03-23T18:34:58.354 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49793.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43444.log 2026-03-23T18:34:58.354 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49793.log.gz 2026-03-23T18:34:58.354 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42264.log.gz 2026-03-23T18:34:58.354 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85612.log 2026-03-23T18:34:58.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43444.log.gz 2026-03-23T18:34:58.355 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31443.log 2026-03-23T18:34:58.355 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44771.log 2026-03-23T18:34:58.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85612.log.gz 2026-03-23T18:34:58.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31443.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31443.log.gz 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80167.log 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44771.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.44771.log.gz 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61079.log 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76045.log 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80167.log.gz 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61079.log.gz 2026-03-23T18:34:58.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47829.log 2026-03-23T18:34:58.357 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.76045.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.56269.log 2026-03-23T18:34:58.357 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76045.log.gz 2026-03-23T18:34:58.357 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47829.log.gz 2026-03-23T18:34:58.357 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31911.log 2026-03-23T18:34:58.358 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56269.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56269.log.gz 2026-03-23T18:34:58.358 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39849.log 2026-03-23T18:34:58.358 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31911.log.gz 2026-03-23T18:34:58.359 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.58286.log 2026-03-23T18:34:58.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39849.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83417.log 2026-03-23T18:34:58.359 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39849.log.gz 2026-03-23T18:34:58.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58286.log.gz 2026-03-23T18:34:58.359 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31379.log 2026-03-23T18:34:58.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83417.log.gz 2026-03-23T18:34:58.360 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45323.log 2026-03-23T18:34:58.360 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36243.log 2026-03-23T18:34:58.360 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31379.log.gz 2026-03-23T18:34:58.360 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45323.log.gz 2026-03-23T18:34:58.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40700.log 2026-03-23T18:34:58.361 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36243.log.gz 2026-03-23T18:34:58.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48192.log 2026-03-23T18:34:58.361 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40700.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.40700.log.gz 2026-03-23T18:34:58.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74719.log 2026-03-23T18:34:58.362 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48192.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63160.log 2026-03-23T18:34:58.362 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48192.log.gz 2026-03-23T18:34:58.362 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74719.log.gz 2026-03-23T18:34:58.362 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32858.log 2026-03-23T18:34:58.363 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63160.log.gz 2026-03-23T18:34:58.363 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32064.log 2026-03-23T18:34:58.363 INFO:teuthology.orchestra.run.vm04.stderr:gzip/var/log/ceph/ceph-client.admin.32858.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.36175.log 2026-03-23T18:34:58.363 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32858.log.gz 2026-03-23T18:34:58.363 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32064.log.gz 2026-03-23T18:34:58.363 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36804.log 2026-03-23T18:34:58.364 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36175.log.gz 2026-03-23T18:34:58.364 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.36413.log 2026-03-23T18:34:58.364 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36804.log.gz 2026-03-23T18:34:58.364 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66640.log 2026-03-23T18:34:58.364 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36413.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79054.log 2026-03-23T18:34:58.365 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36413.log.gz 2026-03-23T18:34:58.365 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66640.log.gz 2026-03-23T18:34:58.365 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82828.log 2026-03-23T18:34:58.365 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79054.log.gz 2026-03-23T18:34:58.365 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69167.log 2026-03-23T18:34:58.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82828.log.gz 2026-03-23T18:34:58.366 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25966.log 2026-03-23T18:34:58.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69167.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50396.log 2026-03-23T18:34:58.366 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69167.log.gz 2026-03-23T18:34:58.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25966.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.25966.log.gz 2026-03-23T18:34:58.367 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51175.log 2026-03-23T18:34:58.367 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50396.log.gz 2026-03-23T18:34:58.367 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48776.log 2026-03-23T18:34:58.367 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51175.log.gz 2026-03-23T18:34:58.368 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37955.log 2026-03-23T18:34:58.368 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48776.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83944.log 2026-03-23T18:34:58.368 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48776.log.gz 2026-03-23T18:34:58.368 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37955.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37955.log.gz 2026-03-23T18:34:58.368 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87158.log 2026-03-23T18:34:58.369 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83944.log.gz 2026-03-23T18:34:58.369 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28974.log 2026-03-23T18:34:58.369 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87158.log.gz 2026-03-23T18:34:58.369 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.76736.log 2026-03-23T18:34:58.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28974.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87402.log 2026-03-23T18:34:58.370 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28974.log.gz 2026-03-23T18:34:58.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76736.log.gz 2026-03-23T18:34:58.370 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31422.log 2026-03-23T18:34:58.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87402.log.gz 2026-03-23T18:34:58.371 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38820.log 2026-03-23T18:34:58.371 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87746.log 2026-03-23T18:34:58.371 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31422.log.gz 2026-03-23T18:34:58.371 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38820.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67562.log 2026-03-23T18:34:58.371 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38820.log.gz 2026-03-23T18:34:58.371 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87746.log.gz 2026-03-23T18:34:58.372 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45538.log 2026-03-23T18:34:58.372 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67562.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.62835.log 2026-03-23T18:34:58.372 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67562.log.gz 2026-03-23T18:34:58.372 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45538.log.gz 2026-03-23T18:34:58.372 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31928.log 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62835.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25636.log 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62835.log.gz 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31928.log.gz 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43318.log 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25636.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64796.log 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25636.log.gz 2026-03-23T18:34:58.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43318.log.gz 2026-03-23T18:34:58.374 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61681.log 2026-03-23T18:34:58.374 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64796.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78391.log 2026-03-23T18:34:58.374 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.64796.log.gz 2026-03-23T18:34:58.374 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61681.log.gz 2026-03-23T18:34:58.374 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84745.log 2026-03-23T18:34:58.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78391.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75873.log 2026-03-23T18:34:58.375 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78391.log.gz 2026-03-23T18:34:58.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84745.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84745.log.gz 2026-03-23T18:34:58.375 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79232.log 2026-03-23T18:34:58.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75873.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27512.log 2026-03-23T18:34:58.375 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75873.log.gz 2026-03-23T18:34:58.376 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79232.log.gz 2026-03-23T18:34:58.376 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41936.log 2026-03-23T18:34:58.376 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27512.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80468.log 2026-03-23T18:34:58.376 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27512.log.gz 2026-03-23T18:34:58.376 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41936.log: 1.2% -- 
replaced with /var/log/ceph/ceph-client.admin.41936.log.gz 2026-03-23T18:34:58.377 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86063.log 2026-03-23T18:34:58.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80468.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47291.log 2026-03-23T18:34:58.377 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80468.log.gz 2026-03-23T18:34:58.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86063.log.gz 2026-03-23T18:34:58.377 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84358.log 2026-03-23T18:34:58.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47291.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47291.log.gz 2026-03-23T18:34:58.378 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77914.log 2026-03-23T18:34:58.378 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84358.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26311.log 2026-03-23T18:34:58.378 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84358.log.gz 2026-03-23T18:34:58.378 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77914.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77914.log.gz 2026-03-23T18:34:58.378 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62433.log 2026-03-23T18:34:58.378 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26311.log.gz 2026-03-23T18:34:58.379 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69416.log 2026-03-23T18:34:58.379 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62433.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35224.log 2026-03-23T18:34:58.379 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62433.log.gz 2026-03-23T18:34:58.379 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69416.log.gz 2026-03-23T18:34:58.379 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46507.log 2026-03-23T18:34:58.380 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35224.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35224.log.gz 2026-03-23T18:34:58.380 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47377.log 2026-03-23T18:34:58.380 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46507.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90239.log 2026-03-23T18:34:58.380 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46507.log.gz 2026-03-23T18:34:58.380 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47377.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47377.log.gz 2026-03-23T18:34:58.380 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77512.log 2026-03-23T18:34:58.381 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90239.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30624.log 2026-03-23T18:34:58.381 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90239.log.gz 2026-03-23T18:34:58.381 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77512.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.77512.log.gz 2026-03-23T18:34:58.381 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67005.log 2026-03-23T18:34:58.381 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30624.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50798.log 2026-03-23T18:34:58.381 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30624.log.gz 2026-03-23T18:34:58.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67005.log.gz 2026-03-23T18:34:58.382 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35869.log 2026-03-23T18:34:58.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50798.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63942.log 2026-03-23T18:34:58.382 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50798.log.gz 2026-03-23T18:34:58.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35869.log.gz 2026-03-23T18:34:58.383 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89637.log 2026-03-23T18:34:58.383 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63942.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67134.log 2026-03-23T18:34:58.383 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63942.log.gz 2026-03-23T18:34:58.383 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89637.log.gz 2026-03-23T18:34:58.383 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.44209.log 2026-03-23T18:34:58.383 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67134.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83252.log 2026-03-23T18:34:58.384 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67134.log.gz 2026-03-23T18:34:58.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44209.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44209.log.gz 2026-03-23T18:34:58.384 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49127.log 2026-03-23T18:34:58.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83252.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39051.log 2026-03-23T18:34:58.384 INFO:teuthology.orchestra.run.vm04.stderr: 88.4% -- replaced with /var/log/ceph/ceph-client.admin.83252.log.gz 2026-03-23T18:34:58.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49127.log.gz 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29034.log 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39051.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75658.log 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39051.log.gz 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29034.log.gz 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69354.log 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75658.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.29355.log 2026-03-23T18:34:58.385 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75658.log.gz 2026-03-23T18:34:58.386 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69354.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67942.log 2026-03-23T18:34:58.386 INFO:teuthology.orchestra.run.vm04.stderr: 12.1% -- replaced with /var/log/ceph/ceph-client.admin.69354.log.gz 2026-03-23T18:34:58.386 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29355.log.gz 2026-03-23T18:34:58.386 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85655.log 2026-03-23T18:34:58.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67942.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67942.log.gz 2026-03-23T18:34:58.387 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42523.log 2026-03-23T18:34:58.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85655.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72936.log 2026-03-23T18:34:58.387 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85655.log.gz 2026-03-23T18:34:58.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42523.log: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.42523.log.gz 2026-03-23T18:34:58.387 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56720.log 2026-03-23T18:34:58.388 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72936.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55967.log 2026-03-23T18:34:58.388 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.72936.log.gz 2026-03-23T18:34:58.388 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56720.log.gz 2026-03-23T18:34:58.388 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64104.log 2026-03-23T18:34:58.388 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55967.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55967.log.gz 2026-03-23T18:34:58.388 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74420.log 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64104.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53933.log 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64104.log.gz 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74420.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74420.log.gz 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35971.log 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53933.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54975.log 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53933.log.gz 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35971.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35971.log.gz 2026-03-23T18:34:58.389 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80619.log 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54975.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.46942.log 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54975.log.gz 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80619.log.gz 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37161.log 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46942.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46942.log.gz 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59093.log 2026-03-23T18:34:58.390 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39741.log 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37161.log.gz 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59093.log.gz 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59136.log 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39741.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80748.log 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39741.log.gz 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59136.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59136.log.gz 2026-03-23T18:34:58.391 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.77015.log 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80748.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73982.log 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80748.log.gz 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77015.log.gz 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78215.log 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73982.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46164.log 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73982.log.gz 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78215.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78215.log.gz 2026-03-23T18:34:58.392 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68049.log 2026-03-23T18:34:58.393 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46164.log.gz 2026-03-23T18:34:58.393 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47528.log 2026-03-23T18:34:58.393 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68049.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73215.log 2026-03-23T18:34:58.393 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68049.log.gz 2026-03-23T18:34:58.393 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47528.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.47528.log.gz 2026-03-23T18:34:58.393 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26200.log 2026-03-23T18:34:58.394 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73215.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73215.log.gz 2026-03-23T18:34:58.394 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80705.log 2026-03-23T18:34:58.394 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26200.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73258.log 2026-03-23T18:34:58.394 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26200.log.gz 2026-03-23T18:34:58.394 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29784.log 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80705.log.gz 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73258.log.gz 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78301.log 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29784.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71716.log 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29784.log.gz 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78301.log.gz 2026-03-23T18:34:58.395 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.77807.log 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71716.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35602.log 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71716.log.gz 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77807.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77807.log.gz 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53761.log 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35602.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84422.log 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35602.log.gz 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53761.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53761.log.gz 2026-03-23T18:34:58.396 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32098.log 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58964.log 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84422.log.gz 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32098.log.gz 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35203.log 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58964.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.46724.log 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58964.log.gz 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35203.log.gz 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35308.log 2026-03-23T18:34:58.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46724.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86302.log 2026-03-23T18:34:58.398 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46724.log.gz 2026-03-23T18:34:58.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35308.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75000.log 2026-03-23T18:34:58.398 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.35308.log.gz 2026-03-23T18:34:58.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86302.log.gz 2026-03-23T18:34:58.398 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66726.log 2026-03-23T18:34:58.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75000.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69334.log 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75000.log.gz 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66726.log.gz 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.34426.log 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69334.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29269.log 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69334.log.gz 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34426.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.34426.log.gz 2026-03-23T18:34:58.399 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37365.log 2026-03-23T18:34:58.400 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29269.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34594.log 2026-03-23T18:34:58.400 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29269.log.gz 2026-03-23T18:34:58.400 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37365.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37365.log.gz 2026-03-23T18:34:58.400 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79998.log 2026-03-23T18:34:58.400 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34594.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27494.log 2026-03-23T18:34:58.400 INFO:teuthology.orchestra.run.vm04.stderr: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.34594.log.gz 2026-03-23T18:34:58.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79998.log.gz 2026-03-23T18:34:58.401 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28284.log 2026-03-23T18:34:58.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27494.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.70028.log 2026-03-23T18:34:58.401 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27494.log.gz 2026-03-23T18:34:58.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28284.log.gz 2026-03-23T18:34:58.401 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33317.log 2026-03-23T18:34:58.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70028.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26806.log 2026-03-23T18:34:58.402 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70028.log.gz 2026-03-23T18:34:58.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33317.log.gz 2026-03-23T18:34:58.402 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42422.log 2026-03-23T18:34:58.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26806.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46635.log 2026-03-23T18:34:58.402 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26806.log.gz 2026-03-23T18:34:58.403 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33880.log 2026-03-23T18:34:58.403 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42422.log.gz 2026-03-23T18:34:58.403 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46635.log.gz 2026-03-23T18:34:58.403 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.34195.log 2026-03-23T18:34:58.403 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33880.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26956.log 2026-03-23T18:34:58.404 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.33880.log.gz 2026-03-23T18:34:58.404 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34195.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38018.log 2026-03-23T18:34:58.404 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.34195.log.gz 2026-03-23T18:34:58.404 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26956.log.gz 2026-03-23T18:34:58.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55687.log 2026-03-23T18:34:58.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83643.log 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38018.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38018.log.gz 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55687.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.55687.log.gz 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87939.log 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29999.log 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83643.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83643.log.gz 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87939.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.87939.log.gz 2026-03-23T18:34:58.405 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47937.log 2026-03-23T18:34:58.406 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29999.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59673.log 2026-03-23T18:34:58.406 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29999.log.gz 2026-03-23T18:34:58.406 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68157.log 2026-03-23T18:34:58.406 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47937.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47937.log.gz 2026-03-23T18:34:58.406 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59673.log.gz 2026-03-23T18:34:58.406 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35988.log 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68157.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83536.log 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68157.log.gz 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35988.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35988.log.gz 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78777.log 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83536.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30257.log 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.83536.log.gz 2026-03-23T18:34:58.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78777.log.gz 2026-03-23T18:34:58.408 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44430.log 2026-03-23T18:34:58.408 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30257.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69103.log 2026-03-23T18:34:58.408 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30257.log.gz 2026-03-23T18:34:58.408 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44430.log: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.44430.log.gz 2026-03-23T18:34:58.408 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28327.log 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69103.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77410.log 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69103.log.gz 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28327.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28327.log.gz 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47897.log 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77410.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55666.log 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77410.log.gz 2026-03-23T18:34:58.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47897.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.47897.log.gz 2026-03-23T18:34:58.410 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78172.log 2026-03-23T18:34:58.410 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55666.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63862.log 2026-03-23T18:34:58.410 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55666.log.gz 2026-03-23T18:34:58.410 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78172.log.gz 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88269.log 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63862.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.27107.log 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.63862.log.gz 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88269.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42285.log 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88269.log.gz 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27107.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27107.log.gz 2026-03-23T18:34:58.411 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65654.log 2026-03-23T18:34:58.412 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42285.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43361.log 2026-03-23T18:34:58.412 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.42285.log.gz 2026-03-23T18:34:58.412 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65654.log.gz 2026-03-23T18:34:58.412 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42746.log 2026-03-23T18:34:58.412 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43361.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29118.log 2026-03-23T18:34:58.412 INFO:teuthology.orchestra.run.vm04.stderr: 56.0% -- replaced with /var/log/ceph/ceph-client.admin.43361.log.gz 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42746.log.gz 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60063.log 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29118.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35707.log 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29118.log.gz 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60063.log.gz 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67770.log 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35707.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80146.log 2026-03-23T18:34:58.413 INFO:teuthology.orchestra.run.vm04.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.35707.log.gz 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67770.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.67770.log.gz 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37913.log 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80146.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36005.log 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80146.log.gz 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37913.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26935.log 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.37913.log.gz 2026-03-23T18:34:58.414 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36005.log.gz 2026-03-23T18:34:58.415 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84766.log 2026-03-23T18:34:58.415 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26935.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59244.log 2026-03-23T18:34:58.415 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26935.log.gz 2026-03-23T18:34:58.415 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84766.log.gz 2026-03-23T18:34:58.415 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53998.log 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59244.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74264.log 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.59244.log.gz 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53998.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81051.log 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53998.log.gz 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74264.log.gz 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88774.log 2026-03-23T18:34:58.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81051.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63218.log 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81051.log.gz 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88774.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42642.log 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88774.log.gz 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63218.log.gz 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45000.log 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42642.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38673.log 2026-03-23T18:34:58.417 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42642.log.gz 2026-03-23T18:34:58.418 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45000.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.43256.log 2026-03-23T18:34:58.418 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45000.log.gz 2026-03-23T18:34:58.418 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38673.log.gz 2026-03-23T18:34:58.418 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22772.log 2026-03-23T18:34:58.418 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43256.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43256.log.gz 2026-03-23T18:34:58.418 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34846.log 2026-03-23T18:34:58.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.22772.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63468.log 2026-03-23T18:34:58.419 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22772.log.gz 2026-03-23T18:34:58.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34846.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.34846.log.gz 2026-03-23T18:34:58.419 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31945.log 2026-03-23T18:34:58.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63468.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78237.log 2026-03-23T18:34:58.419 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63468.log.gz 2026-03-23T18:34:58.420 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31945.log.gz 2026-03-23T18:34:58.420 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.88863.log 2026-03-23T18:34:58.420 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78237.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38060.log 2026-03-23T18:34:58.420 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78237.log.gz 2026-03-23T18:34:58.420 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88863.log.gz 2026-03-23T18:34:58.420 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51648.log 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38060.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37976.log 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51648.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51648.log.gz 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38060.log.gz 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56742.log 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37976.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60846.log 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56742.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.37976.log.gz 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56742.log.gz 2026-03-23T18:34:58.421 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45709.log 2026-03-23T18:34:58.422 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60846.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.43671.log 2026-03-23T18:34:58.422 INFO:teuthology.orchestra.run.vm04.stderr: 10.7% -- replaced with /var/log/ceph/ceph-client.admin.60846.log.gz 2026-03-23T18:34:58.422 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45709.log.gz 2026-03-23T18:34:58.422 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91645.log 2026-03-23T18:34:58.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43671.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.43671.log.gz 2026-03-23T18:34:58.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36634.log 2026-03-23T18:34:58.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91645.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66089.log 2026-03-23T18:34:58.423 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91645.log.gz 2026-03-23T18:34:58.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36634.log.gz 2026-03-23T18:34:58.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53461.log 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66089.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55494.log 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66089.log.gz 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53461.log.gz 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.49342.log 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55494.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65161.log 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55494.log.gz 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49342.log.gz 2026-03-23T18:34:58.424 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35518.log 2026-03-23T18:34:58.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65161.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63902.log 2026-03-23T18:34:58.425 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65161.log.gz 2026-03-23T18:34:58.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35518.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.35518.log.gz 2026-03-23T18:34:58.425 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62391.log 2026-03-23T18:34:58.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36447.log 2026-03-23T18:34:58.425 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63902.log.gz 2026-03-23T18:34:58.426 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62391.log.gz 2026-03-23T18:34:58.426 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70046.log 2026-03-23T18:34:58.426 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36447.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.28812.log 2026-03-23T18:34:58.426 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36447.log.gz 2026-03-23T18:34:58.426 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70046.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.70046.log.gz 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36991.log 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28812.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54422.log 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28812.log.gz 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36991.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36991.log.gz 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45516.log 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74496.log 2026-03-23T18:34:58.427 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54422.log.gz 2026-03-23T18:34:58.428 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45516.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45516.log.gz 2026-03-23T18:34:58.428 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34951.log 2026-03-23T18:34:58.428 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78640.log 2026-03-23T18:34:58.428 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74496.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.74496.log.gz 2026-03-23T18:34:58.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34951.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47743.log 2026-03-23T18:34:58.429 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.34951.log.gz 2026-03-23T18:34:58.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78640.log.gz 2026-03-23T18:34:58.429 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35476.log 2026-03-23T18:34:58.430 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47743.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69374.log 2026-03-23T18:34:58.430 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47743.log.gz 2026-03-23T18:34:58.430 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35476.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.35476.log.gz 2026-03-23T18:34:58.430 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67521.log 2026-03-23T18:34:58.430 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69374.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69374.log.gz 2026-03-23T18:34:58.431 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65440.log 2026-03-23T18:34:58.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67521.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84138.log 2026-03-23T18:34:58.431 INFO:teuthology.orchestra.run.vm04.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.67521.log.gz 2026-03-23T18:34:58.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65440.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.65440.log.gz 2026-03-23T18:34:58.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63745.log 2026-03-23T18:34:58.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5/var/log/ceph/ceph-client.admin.84138.log: --verbose -- /var/log/ceph/ceph-client.admin.35119.log 2026-03-23T18:34:58.432 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84138.log.gz 2026-03-23T18:34:58.432 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63745.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63745.log.gz 2026-03-23T18:34:58.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44671.log 2026-03-23T18:34:58.432 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35119.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60218.log 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.35119.log.gz 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44671.log.gz 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63630.log 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60218.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68737.log 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60218.log.gz 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63630.log.gz 2026-03-23T18:34:58.433 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.50267.log 2026-03-23T18:34:58.434 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68737.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89680.log 2026-03-23T18:34:58.434 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68737.log.gz 2026-03-23T18:34:58.434 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50267.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50267.log.gz 2026-03-23T18:34:58.434 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81610.log 2026-03-23T18:34:58.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89680.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80253.log 2026-03-23T18:34:58.435 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89680.log.gz 2026-03-23T18:34:58.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81610.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81610.log.gz 2026-03-23T18:34:58.435 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60582.log 2026-03-23T18:34:58.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80253.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.80253.log.gz 2026-03-23T18:34:58.435 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.43050.log 2026-03-23T18:34:58.436 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60582.log.gz 2026-03-23T18:34:58.436 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64491.log 2026-03-23T18:34:58.436 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43050.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.43050.log.gz 2026-03-23T18:34:58.436 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27259.log 2026-03-23T18:34:58.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64491.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26505.log 2026-03-23T18:34:58.437 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64491.log.gz 2026-03-23T18:34:58.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27259.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27259.log.gz 2026-03-23T18:34:58.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51540.log 2026-03-23T18:34:58.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26505.log.gz 2026-03-23T18:34:58.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30020.log 2026-03-23T18:34:58.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51540.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51540.log.gzgzip 2026-03-23T18:34:58.438 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.51390.log 2026-03-23T18:34:58.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30020.log.gz 2026-03-23T18:34:58.438 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27646.log 2026-03-23T18:34:58.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51390.log.gzgzip 2026-03-23T18:34:58.438 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- 
/var/log/ceph/ceph-client.admin.64202.log 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27646.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27646.log.gz 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40858.log 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64202.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55859.log 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64202.log.gz 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40858.log.gz 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55282.log 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55859.log: gzip -5 --verbose 0.0% -- -- replaced with /var/log/ceph/ceph-client.admin.55859.log.gz /var/log/ceph/ceph-client.admin.54789.log 2026-03-23T18:34:58.439 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:58.440 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55282.log.gz 2026-03-23T18:34:58.440 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62776.log 2026-03-23T18:34:58.440 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54789.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79627.log 2026-03-23T18:34:58.440 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54789.log.gz 2026-03-23T18:34:58.440 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62776.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.62776.log.gz 2026-03-23T18:34:58.440 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76153.log 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79627.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43969.log 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79627.log.gz 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76153.log.gz 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70474.log 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43969.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42502.log 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr: 31.5% -- replaced with /var/log/ceph/ceph-client.admin.43969.log.gz 2026-03-23T18:34:58.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70474.log.gz 2026-03-23T18:34:58.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73151.log 2026-03-23T18:34:58.442 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42502.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.42502.log.gz --verbose 2026-03-23T18:34:58.442 INFO:teuthology.orchestra.run.vm04.stderr: -- /var/log/ceph/ceph-client.admin.73043.log 2026-03-23T18:34:58.442 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73151.log.gz 2026-03-23T18:34:58.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.64987.log 2026-03-23T18:34:58.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29569.log 2026-03-23T18:34:58.443 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64987.log: /var/log/ceph/ceph-client.admin.73043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64987.log.gz 2026-03-23T18:34:58.443 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73043.log.gz 2026-03-23T18:34:58.443 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46078.log 2026-03-23T18:34:58.443 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29569.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29569.log.gz 2026-03-23T18:34:58.443 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30662.log 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46078.log.gz 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46357.log 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30662.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70667.log 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.30662.log.gz 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46357.log.gz 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27044.log 2026-03-23T18:34:58.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70667.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.70667.log.gz 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71135.log 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27044.log.gz 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50714.log 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71135.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41851.log 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71135.log.gz 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50714.log.gz 2026-03-23T18:34:58.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81992.log 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41851.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54213.log 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41851.log.gz 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81992.log.gz 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68178.log 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83146.log 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54213.log: /var/log/ceph/ceph-client.admin.68178.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.54213.log.gz 2026-03-23T18:34:58.446 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68178.log.gz 2026-03-23T18:34:58.447 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36464.log 2026-03-23T18:34:58.447 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83146.log.gz 2026-03-23T18:34:58.447 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72215.log 2026-03-23T18:34:58.447 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36464.log.gz 2026-03-23T18:34:58.447 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38102.log 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72215.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36838.log 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.72215.log.gz 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38102.log.gz 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41270.log 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36838.log.gz 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65007.log 2026-03-23T18:34:58.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41270.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.41270.log.gz 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62524.log 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65007.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46056.log 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65007.log.gz 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62524.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62524.log.gz 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75034.log 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46056.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46056.log.gz 2026-03-23T18:34:58.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30128.log 2026-03-23T18:34:58.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75034.log.gz 2026-03-23T18:34:58.450 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37787.log 2026-03-23T18:34:58.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30128.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36141.log 2026-03-23T18:34:58.450 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30128.log.gz 2026-03-23T18:34:58.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37787.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28198.log 2026-03-23T18:34:58.450 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with 
/var/log/ceph/ceph-client.admin.37787.log.gz 2026-03-23T18:34:58.451 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36141.log.gz 2026-03-23T18:34:58.451 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72153.log 2026-03-23T18:34:58.451 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28198.log.gz 2026-03-23T18:34:58.451 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48017.log 2026-03-23T18:34:58.451 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72153.log.gz 2026-03-23T18:34:58.451 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37042.log 2026-03-23T18:34:58.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48017.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.48017.log.gz 2026-03-23T18:34:58.452 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.64571.log 2026-03-23T18:34:58.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37042.log.gz 2026-03-23T18:34:58.452 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39360.log 2026-03-23T18:34:58.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64571.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.64571.log.gz -5 2026-03-23T18:34:58.452 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.89379.log 2026-03-23T18:34:58.453 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39360.log: 26.5% -- 
replaced with /var/log/ceph/ceph-client.admin.39360.log.gz 2026-03-23T18:34:58.453 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29741.log 2026-03-23T18:34:58.453 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89379.log.gz 2026-03-23T18:34:58.453 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51669.log 2026-03-23T18:34:58.453 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29741.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77477.log 2026-03-23T18:34:58.453 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29741.log.gz 2026-03-23T18:34:58.454 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51669.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.51669.log.gz -5 2026-03-23T18:34:58.454 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.35835.log 2026-03-23T18:34:58.454 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77477.log.gz 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75464.log 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35835.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36430.log 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35835.log.gz 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75464.log.gz 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61703.log 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36430.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40042.log 2026-03-23T18:34:58.455 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36430.log.gz 2026-03-23T18:34:58.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61703.log.gz 2026-03-23T18:34:58.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86603.log 2026-03-23T18:34:58.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40042.log.gz 2026-03-23T18:34:58.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45366.log 2026-03-23T18:34:58.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86603.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58306.log 2026-03-23T18:34:58.456 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86603.log.gz 2026-03-23T18:34:58.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45366.log.gz 2026-03-23T18:34:58.457 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51007.log 2026-03-23T18:34:58.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58306.log.gz 2026-03-23T18:34:58.457 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73640.log 2026-03-23T18:34:58.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51007.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.49256.log 2026-03-23T18:34:58.458 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51007.log.gz 2026-03-23T18:34:58.458 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73640.log.gz 2026-03-23T18:34:58.458 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72031.log 2026-03-23T18:34:58.458 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49256.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.49256.log.gz -5 2026-03-23T18:34:58.458 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.32660.log 2026-03-23T18:34:58.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72031.log.gz 2026-03-23T18:34:58.459 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39870.log 2026-03-23T18:34:58.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32660.log.gz 2026-03-23T18:34:58.459 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60993.log 2026-03-23T18:34:58.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39870.log.gz 2026-03-23T18:34:58.460 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65342.log 2026-03-23T18:34:58.460 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60993.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60993.log.gz 2026-03-23T18:34:58.460 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.40064.log 2026-03-23T18:34:58.460 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65342.log.gz 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49044.log 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40064.log.gz 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77251.log 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49044.log.gz 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72957.log 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77251.log.gz 2026-03-23T18:34:58.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82046.log 2026-03-23T18:34:58.462 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72957.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72957.log.gz 2026-03-23T18:34:58.462 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89057.log 2026-03-23T18:34:58.462 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82046.log.gz 2026-03-23T18:34:58.462 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77548.log 2026-03-23T18:34:58.463 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89057.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.89057.log.gz 2026-03-23T18:34:58.463 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35852.log 2026-03-23T18:34:58.463 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77548.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.77548.log.gz --verbose 2026-03-23T18:34:58.463 INFO:teuthology.orchestra.run.vm04.stderr: -- /var/log/ceph/ceph-client.admin.65750.log 2026-03-23T18:34:58.463 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.35852.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.35852.log.gz --verbose 2026-03-23T18:34:58.463 INFO:teuthology.orchestra.run.vm04.stderr: -- /var/log/ceph/ceph-client.admin.34678.log 2026-03-23T18:34:58.464 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65750.log.gz 2026-03-23T18:34:58.464 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80232.log 2026-03-23T18:34:58.464 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34678.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47786.log 2026-03-23T18:34:58.464 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.34678.log.gz 2026-03-23T18:34:58.464 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80232.log.gz 2026-03-23T18:34:58.465 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49578.log 2026-03-23T18:34:58.465 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47786.log.gz 2026-03-23T18:34:58.465 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61638.log 2026-03-23T18:34:58.465 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49578.log.gz 2026-03-23T18:34:58.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32456.log 2026-03-23T18:34:58.466 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61638.log.gz 2026-03-23T18:34:58.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28778.log 2026-03-23T18:34:58.466 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32456.log.gz 2026-03-23T18:34:58.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83398.log 2026-03-23T18:34:58.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28778.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84336.log 2026-03-23T18:34:58.467 INFO:teuthology.orchestra.run.vm04.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.28778.log.gz 2026-03-23T18:34:58.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83398.log.gz 2026-03-23T18:34:58.467 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48100.log 2026-03-23T18:34:58.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84336.log.gz 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44530.log 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48100.log: gzip -5 --verbose 
-- /var/log/ceph/ceph-client.admin.63143.log 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48100.log.gz 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44530.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28499.log 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr: 31.8% -- replaced with /var/log/ceph/ceph-client.admin.44530.log.gz 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63143.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63143.log.gz 2026-03-23T18:34:58.468 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45430.log 2026-03-23T18:34:58.469 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28499.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46379.log 2026-03-23T18:34:58.469 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28499.log.gz 2026-03-23T18:34:58.469 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45430.log.gz 2026-03-23T18:34:58.469 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86539.log 2026-03-23T18:34:58.470 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32926.log 2026-03-23T18:34:58.470 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46379.log.gz 2026-03-23T18:34:58.470 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86539.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86539.log.gz 2026-03-23T18:34:58.470 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.39402.log 2026-03-23T18:34:58.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32926.log.gz 2026-03-23T18:34:58.471 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84202.log 2026-03-23T18:34:58.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39402.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82398.log 2026-03-23T18:34:58.471 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.39402.log.gz 2026-03-23T18:34:58.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84202.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84202.log.gz 2026-03-23T18:34:58.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67192.log 2026-03-23T18:34:58.472 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82398.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77850.log 2026-03-23T18:34:58.472 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82398.log.gz 2026-03-23T18:34:58.472 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67192.log.gz 2026-03-23T18:34:58.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59690.log 2026-03-23T18:34:58.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77850.log.gzgzip 2026-03-23T18:34:58.473 INFO:teuthology.orchestra.run.vm04.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.74321.log 2026-03-23T18:34:58.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59690.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.59690.log.gz 2026-03-23T18:34:58.473 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74176.log 2026-03-23T18:34:58.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74321.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.74321.log.gz /var/log/ceph/ceph-client.admin.36379.log 2026-03-23T18:34:58.473 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:34:58.474 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74176.log.gz 2026-03-23T18:34:58.474 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27720.log 2026-03-23T18:34:58.474 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69707.log 2026-03-23T18:34:58.474 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36379.log.gz 2026-03-23T18:34:58.475 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27720.log.gz 2026-03-23T18:34:58.475 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60335.log 2026-03-23T18:34:58.475 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69707.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51519.log 2026-03-23T18:34:58.475 INFO:teuthology.orchestra.run.vm04.stderr: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.69707.log.gz 2026-03-23T18:34:58.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60335.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73706.log 2026-03-23T18:34:58.476 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60335.log.gz 2026-03-23T18:34:58.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51519.log.gz 2026-03-23T18:34:58.476 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50861.log 2026-03-23T18:34:58.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73706.log.gz 2026-03-23T18:34:58.476 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89121.log 2026-03-23T18:34:58.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50861.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87789.log 2026-03-23T18:34:58.477 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50861.log.gz 2026-03-23T18:34:58.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89121.log.gz 2026-03-23T18:34:58.477 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65181.log 2026-03-23T18:34:58.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87789.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32388.log 2026-03-23T18:34:58.477 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87789.log.gz 2026-03-23T18:34:58.478 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65181.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65181.log.gz 2026-03-23T18:34:58.478 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46035.log 2026-03-23T18:34:58.478 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32388.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.32388.log.gz 2026-03-23T18:34:58.478 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71914.log 2026-03-23T18:34:58.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46035.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46035.log.gz 2026-03-23T18:34:58.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87235.log 2026-03-23T18:34:58.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71914.log: 11.8% -- replaced with /var/log/ceph/ceph-client.admin.71914.log.gz 2026-03-23T18:34:58.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60806.log 2026-03-23T18:34:58.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87235.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87235.log.gz 2026-03-23T18:34:58.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54541.log 2026-03-23T18:34:58.480 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60806.log.gz 2026-03-23T18:34:58.480 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50375.log 2026-03-23T18:34:58.480 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54541.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61660.log 2026-03-23T18:34:58.480 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54541.log.gz 2026-03-23T18:34:58.481 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50375.log.gz 2026-03-23T18:34:58.481 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
[... several hundred similar interleaved gzip entries elided: teuthology ran parallel `gzip -5 --verbose` jobs on vm04, replacing each remaining /var/log/ceph/ceph-client.admin.<pid>.log with the corresponding .log.gz (most reported 0.0%; a handful of text-heavy logs compressed in the 11.8%–58.4% range) ...]
/var/log/ceph/ceph-client.admin.61746.log.gz 2026-03-23T18:34:58.520 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66130.log 2026-03-23T18:34:58.520 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57671.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37076.log 2026-03-23T18:34:58.520 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57671.log.gz 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66130.log.gz 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70965.log 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37076.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61359.log 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37076.log.gz 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70965.log: 55.7% -- replaced with /var/log/ceph/ceph-client.admin.70965.log.gz 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46513.log 2026-03-23T18:34:58.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61359.log.gz 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43195.log 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46513.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71393.log 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.46513.log.gz 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43195.log.gz 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63086.log 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71393.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56096.log 2026-03-23T18:34:58.522 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71393.log.gz 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63086.log.gz 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39913.log 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56096.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68995.log 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56096.log.gz 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39913.log.gz 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84938.log 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68995.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30042.log 2026-03-23T18:34:58.523 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68995.log.gz 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84938.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.84938.log.gz 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56010.log 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30042.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88708.log 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30042.log.gz 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56010.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56010.log.gz 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78476.log 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88708.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68952.log 2026-03-23T18:34:58.524 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88708.log.gz 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78476.log.gz 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63107.log 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68952.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87531.log 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68952.log.gz 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63107.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63107.log.gz 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82095.log 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87531.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36974.log 2026-03-23T18:34:58.525 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87531.log.gz 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82095.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82095.log.gz 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59265.log 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36974.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26999.log 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36974.log.gz 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59265.log.gz 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33796.log 2026-03-23T18:34:58.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26999.log.gz 2026-03-23T18:34:58.527 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44629.log 2026-03-23T18:34:58.527 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33796.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74193.log 2026-03-23T18:34:58.527 INFO:teuthology.orchestra.run.vm04.stderr: 24.9% -- replaced with /var/log/ceph/ceph-client.admin.33796.log.gz 2026-03-23T18:34:58.527 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44629.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.44629.log.gz 2026-03-23T18:34:58.527 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47024.log 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74193.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56849.log 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74193.log.gz 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47024.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49923.log 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47024.log.gz 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56849.log.gz 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78830.log 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49923.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61058.log 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49923.log.gz 2026-03-23T18:34:58.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78830.log.gz 2026-03-23T18:34:58.529 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39655.log 2026-03-23T18:34:58.529 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61058.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73879.log 2026-03-23T18:34:58.529 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.61058.log.gz 2026-03-23T18:34:58.529 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39655.log.gz 2026-03-23T18:34:58.529 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59330.log 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73879.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85185.log 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73879.log.gz 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59330.log.gz 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33606.log 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85185.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44730.log 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85185.log.gz 2026-03-23T18:34:58.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33606.log.gz 2026-03-23T18:34:58.531 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57099.log 2026-03-23T18:34:58.531 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44730.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66898.log 2026-03-23T18:34:58.531 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.44730.log.gz 2026-03-23T18:34:58.531 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57099.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.57099.log.gz 2026-03-23T18:34:58.531 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64654.log 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66898.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28392.log 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66898.log.gz 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64654.log.gz 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64140.log 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28392.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36872.log 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28392.log.gz 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64140.log.gz 2026-03-23T18:34:58.532 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67500.log 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36872.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81503.log 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36872.log.gz 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67500.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.67500.log.gz 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.42906.log 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81503.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87638.log 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81503.log.gz 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42906.log.gz 2026-03-23T18:34:58.533 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48080.log 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87638.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87552.log 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87638.log.gz 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48080.log.gz 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69232.log 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87552.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.73826.log 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/ceph-client.admin.87552.log.gz 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69232.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60745.log 2026-03-23T18:34:58.534 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69232.log.gz 2026-03-23T18:34:58.535 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73826.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.73826.log.gz 2026-03-23T18:34:58.535 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78895.log 2026-03-23T18:34:58.535 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60745.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42822.log 2026-03-23T18:34:58.535 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60745.log.gz 2026-03-23T18:34:58.535 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78895.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78895.log.gz 2026-03-23T18:34:58.535 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55302.log 2026-03-23T18:34:58.536 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42822.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.42822.log.gz 2026-03-23T18:34:58.536 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84379.log 2026-03-23T18:34:58.536 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55302.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77828.log 2026-03-23T18:34:58.536 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55302.log.gz 2026-03-23T18:34:58.536 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84379.log.gz 2026-03-23T18:34:58.536 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82290.log 2026-03-23T18:34:58.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77828.log.gz 2026-03-23T18:34:58.537 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82184.log 2026-03-23T18:34:58.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29247.log 2026-03-23T18:34:58.537 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82290.log.gz 2026-03-23T18:34:58.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82184.log.gz 2026-03-23T18:34:58.537 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68221.log 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29247.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34888.log 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29247.log.gz 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68221.log.gz 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38694.log 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34888.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74909.log 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34888.log.gz 2026-03-23T18:34:58.538 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84852.log 2026-03-23T18:34:58.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38694.log: /var/log/ceph/ceph-client.admin.74909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74909.log.gz 2026-03-23T18:34:58.539 INFO:teuthology.orchestra.run.vm04.stderr: 26.6% -- 
replaced with /var/log/ceph/ceph-client.admin.38694.log.gz 2026-03-23T18:34:58.539 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40996.log 2026-03-23T18:34:58.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84852.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81481.log 2026-03-23T18:34:58.539 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84852.log.gz 2026-03-23T18:34:58.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40996.log.gz 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88430.log 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81481.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57800.log 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81481.log.gz 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88430.log.gz 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78258.log 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57800.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36787.log 2026-03-23T18:34:58.540 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57800.log.gz 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78258.log.gz 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.72132.log 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36787.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54604.log 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36787.log.gz 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72132.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57628.log 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr: 88.7% -- replaced with /var/log/ceph/ceph-client.admin.72132.log.gz 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54604.log.gz 2026-03-23T18:34:58.541 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77742.log 2026-03-23T18:34:58.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57628.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62154.log 2026-03-23T18:34:58.542 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57628.log.gz 2026-03-23T18:34:58.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77742.log.gz 2026-03-23T18:34:58.542 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85805.log 2026-03-23T18:34:58.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62154.log.gz 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43589.log 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85805.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.26354.log 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85805.log.gz 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43589.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79960.log 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.43589.log.gz 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26354.log.gz 2026-03-23T18:34:58.543 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86106.log 2026-03-23T18:34:58.544 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79960.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86947.log 2026-03-23T18:34:58.544 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79960.log.gz 2026-03-23T18:34:58.544 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86106.log.gz 2026-03-23T18:34:58.544 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71458.log 2026-03-23T18:34:58.544 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86947.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75447.log 2026-03-23T18:34:58.544 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86947.log.gz 2026-03-23T18:34:58.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71458.log.gz 2026-03-23T18:34:58.545 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71802.log 2026-03-23T18:34:58.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75447.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67602.log 2026-03-23T18:34:58.545 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75447.log.gz 2026-03-23T18:34:58.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71802.log.gz 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78941.log 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67602.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70818.log 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.67602.log.gz 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78941.log.gz 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67252.log 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70818.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36294.log 2026-03-23T18:34:58.546 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70818.log.gz 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67252.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41485.log 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr: 56.0% -- replaced with /var/log/ceph/ceph-client.admin.67252.log.gz 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36294.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.36294.log.gz 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72511.log 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41485.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68974.log 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41485.log.gz 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72511.log.gz 2026-03-23T18:34:58.547 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68845.log 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68974.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56462.log 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68974.log.gz 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68845.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68845.log.gz 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37682.log 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56462.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91327.log 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56462.log.gz 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37682.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77656.log 2026-03-23T18:34:58.548 INFO:teuthology.orchestra.run.vm04.stderr: 25.4% -- replaced with 
/var/log/ceph/ceph-client.admin.37682.log.gz 2026-03-23T18:34:58.549 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91327.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91327.log.gz 2026-03-23T18:34:58.549 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45473.log 2026-03-23T18:34:58.549 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77656.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41421.log 2026-03-23T18:34:58.549 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77656.log.gz 2026-03-23T18:34:58.549 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45473.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45473.log.gz 2026-03-23T18:34:58.549 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61488.log 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41421.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42544.log 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41421.log.gz 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61488.log.gz 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67856.log 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42544.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70237.log 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr: 26.0%/var/log/ceph/ceph-client.admin.67856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67856.log.gz 2026-03-23T18:34:58.550 INFO:teuthology.orchestra.run.vm04.stderr: -- 
replaced with /var/log/ceph/ceph-client.admin.42544.log.gz 2026-03-23T18:34:58.551 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55429.log 2026-03-23T18:34:58.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70237.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55408.log 2026-03-23T18:34:58.551 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70237.log.gz 2026-03-23T18:34:58.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55429.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34825.log 2026-03-23T18:34:58.551 INFO:teuthology.orchestra.run.vm04.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.55429.log.gz 2026-03-23T18:34:58.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55408.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55408.log.gz 2026-03-23T18:34:58.552 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74702.log 2026-03-23T18:34:58.552 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34825.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91567.log 2026-03-23T18:34:58.552 INFO:teuthology.orchestra.run.vm04.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.34825.log.gz 2026-03-23T18:34:58.552 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74702.log.gz 2026-03-23T18:34:58.552 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36260.log 2026-03-23T18:34:58.552 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91567.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38358.log 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.91567.log.gz 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36260.log.gz 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36498.log 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38358.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58711.log 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36498.log.gz 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38358.log.gz 2026-03-23T18:34:58.553 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78598.log 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58711.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43153.log 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.58711.log.gz 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78598.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78598.log.gz 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84052.log 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43153.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40085.log 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43153.log.gz 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84052.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.84052.log.gz 2026-03-23T18:34:58.554 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76196.log 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40085.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88158.log 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40085.log.gz 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76196.log.gz 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26419.log 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88158.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59652.log 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88158.log.gz 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26419.log.gz 2026-03-23T18:34:58.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75091.log 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59652.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49664.log 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59652.log.gz 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75091.log.gz 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61531.log 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49664.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84723.log 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49664.log.gz 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61531.log.gz 2026-03-23T18:34:58.556 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89100.log 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84723.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70880.log 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84723.log.gz 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89100.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89100.log.gz 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33079.log 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70880.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43007.log 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70880.log.gz 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33079.log.gz 2026-03-23T18:34:58.557 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77123.log 2026-03-23T18:34:58.558 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43007.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.68823.log 2026-03-23T18:34:58.558 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43007.log.gz 2026-03-23T18:34:58.558 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77123.log.gz 2026-03-23T18:34:58.558 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49902.log 2026-03-23T18:34:58.558 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68823.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73065.log 2026-03-23T18:34:58.558 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68823.log.gz 2026-03-23T18:34:58.559 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49902.log.gz 2026-03-23T18:34:58.559 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89164.log 2026-03-23T18:34:58.559 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73065.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73473.log 2026-03-23T18:34:58.559 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73065.log.gz 2026-03-23T18:34:58.559 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89164.log.gz 2026-03-23T18:34:58.559 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42161.log 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73473.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64221.log 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.73473.log.gz 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42161.log.gz 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74460.log 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64221.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62585.log 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64221.log.gz 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74460.log.gz 2026-03-23T18:34:58.560 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90282.log 2026-03-23T18:34:58.561 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62585.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28954.log 2026-03-23T18:34:58.561 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62585.log.gz 2026-03-23T18:34:58.561 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90282.log.gz 2026-03-23T18:34:58.561 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32756.log 2026-03-23T18:34:58.561 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28954.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32115.log 2026-03-23T18:34:58.561 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28954.log.gz 2026-03-23T18:34:58.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32756.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.32756.log.gz 2026-03-23T18:34:58.562 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54170.log 2026-03-23T18:34:58.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32115.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47141.log 2026-03-23T18:34:58.562 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32115.log.gz 2026-03-23T18:34:58.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54170.log.gz 2026-03-23T18:34:58.562 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49085.log 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47141.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54871.log 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47141.log.gz 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49085.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42966.log 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49085.log.gz 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54871.log.gz 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40465.log 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42966.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85977.log 2026-03-23T18:34:58.563 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.42966.log.gz 2026-03-23T18:34:58.564 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40465.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40465.log.gz 2026-03-23T18:34:58.564 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28795.log 2026-03-23T18:34:58.564 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85977.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63592.log 2026-03-23T18:34:58.564 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85977.log.gz 2026-03-23T18:34:58.564 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28795.log.gz 2026-03-23T18:34:58.565 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67729.log 2026-03-23T18:34:58.565 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63592.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84465.log 2026-03-23T18:34:58.565 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63592.log.gz 2026-03-23T18:34:58.565 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67729.log.gz 2026-03-23T18:34:58.565 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57140.log 2026-03-23T18:34:58.565 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84465.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86840.log 2026-03-23T18:34:58.566 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84465.log.gz 2026-03-23T18:34:58.566 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57140.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.57140.log.gz 2026-03-23T18:34:58.566 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64297.log 2026-03-23T18:34:58.566 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86840.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29183.log 2026-03-23T18:34:58.566 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86840.log.gz 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64297.log: 54.9% -- replaced with /var/log/ceph/ceph-client.admin.64297.log.gz 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30926.log 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29183.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83837.log 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29183.log.gz 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.30926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30926.log.gz 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56527.log 2026-03-23T18:34:58.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83837.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67400.log 2026-03-23T18:34:58.568 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83837.log.gz 2026-03-23T18:34:58.568 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56527.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56527.log.gz 2026-03-23T18:34:58.568 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.37493.log 2026-03-23T18:34:58.568 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67400.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66597.log 2026-03-23T18:34:58.568 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67400.log.gz 2026-03-23T18:34:58.568 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37493.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37493.log.gz 2026-03-23T18:34:58.569 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72752.log 2026-03-23T18:34:58.569 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66597.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51777.log 2026-03-23T18:34:58.569 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66597.log.gz 2026-03-23T18:34:58.569 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72752.log.gz 2026-03-23T18:34:58.569 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56117.log 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51777.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66790.log 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51777.log.gz 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56117.log.gz 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25940.log 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66790.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.34258.log 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66790.log.gz 2026-03-23T18:34:58.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25940.log.gz 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44010.log 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34258.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49406.log 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44010.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.34258.log.gz 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44010.log.gz 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55079.log 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49406.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75787.log 2026-03-23T18:34:58.571 INFO:teuthology.orchestra.run.vm04.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.49406.log.gz 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55079.log.gz 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66446.log 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75787.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55100.log 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.75787.log.gz 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66446.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66446.log.gz -5 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr: --verbose -- /var/log/ceph/ceph-client.admin.43484.log 2026-03-23T18:34:58.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55100.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55100.log.gz 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56952.log 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43484.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28832.log 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.43484.log.gz 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56952.log.gz 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53675.log 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28832.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38841.log 2026-03-23T18:34:58.573 INFO:teuthology.orchestra.run.vm04.stderr: 5.3% -- replaced with /var/log/ceph/ceph-client.admin.28832.log.gz 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53675.log.gz 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57735.log 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38841.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.47700.log 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38841.log.gz 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57735.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64318.log 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57735.log.gz 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.47700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47700.log.gz 2026-03-23T18:34:58.574 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62133.log 2026-03-23T18:34:58.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64318.log.gz 2026-03-23T18:34:58.575 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58363.log 2026-03-23T18:34:58.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62133.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41442.log 2026-03-23T18:34:58.575 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62133.log.gz 2026-03-23T18:34:58.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58363.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58363.log.gz 2026-03-23T18:34:58.575 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31529.log 2026-03-23T18:34:58.576 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41442.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61208.log 2026-03-23T18:34:58.576 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.41442.log.gz 2026-03-23T18:34:58.576 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31529.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77785.log 2026-03-23T18:34:58.576 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31529.log.gz 2026-03-23T18:34:58.576 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61208.log.gz 2026-03-23T18:34:58.576 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67172.log 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77785.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40484.log 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77785.log.gz 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67172.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49858.log 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67172.log.gz 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.40484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40484.log.gz 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56053.log 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49858.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34153.log 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49858.log.gz 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56053.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.56053.log.gz 2026-03-23T18:34:58.577 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81094.log 2026-03-23T18:34:58.578 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34153.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54809.log 2026-03-23T18:34:58.578 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81094.log: 27.0% -- replaced with /var/log/ceph/ceph-client.admin.34153.log.gz 2026-03-23T18:34:58.578 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81094.log.gz 2026-03-23T18:34:58.578 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51948.log 2026-03-23T18:34:58.578 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36532.log 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54809.log: /var/log/ceph/ceph-client.admin.51948.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54809.log.gz 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51948.log.gz 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85504.log 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36532.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85719.log 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36532.log.gz 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85504.log.gz 2026-03-23T18:34:58.579 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.36753.log 2026-03-23T18:34:58.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85719.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58203.log 2026-03-23T18:34:58.580 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85719.log.gz 2026-03-23T18:34:58.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36753.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68393.log 2026-03-23T18:34:58.580 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36753.log.gz 2026-03-23T18:34:58.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58203.log.gz 2026-03-23T18:34:58.580 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44330.log 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68393.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69604.log 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68393.log.gz 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44330.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45581.log 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44330.log.gz 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69604.log.gz 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63570.log 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45581.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.82010.log 2026-03-23T18:34:58.581 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45581.log.gz 2026-03-23T18:34:58.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63570.log: 59.0% -- replaced with /var/log/ceph/ceph-client.admin.63570.log.gz 2026-03-23T18:34:58.582 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27239.log 2026-03-23T18:34:58.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82010.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70345.log 2026-03-23T18:34:58.582 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82010.log.gz 2026-03-23T18:34:58.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27239.log.gz 2026-03-23T18:34:58.583 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36515.log 2026-03-23T18:34:58.583 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70345.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86732.log 2026-03-23T18:34:58.583 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70345.log.gz 2026-03-23T18:34:58.583 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36515.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37093.log 2026-03-23T18:34:58.583 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36515.log.gz 2026-03-23T18:34:58.583 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86732.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86732.log.gz 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.42037.log 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.37093.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56994.log 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37093.log.gz 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42037.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82148.log 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr: 19.1% -- replaced with /var/log/ceph/ceph-client.admin.42037.log.gz 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56994.log.gz 2026-03-23T18:34:58.584 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71157.log 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82148.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75549.log 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82148.log.gz 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71157.log.gz 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42663.log 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75549.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65516.log 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75549.log.gz 2026-03-23T18:34:58.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42663.log: 59.0% -- 
replaced with /var/log/ceph/ceph-client.admin.42663.log.gz 2026-03-23T18:34:58.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41141.log 2026-03-23T18:34:58.586 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65516.log: gzip -5 --verbose -- /var/log/ceph/ceph-osd.2.log 2026-03-23T18:34:58.586 INFO:teuthology.orchestra.run.vm04.stderr: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.65516.log.gz 2026-03-23T18:34:58.586 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41141.log.gz 2026-03-23T18:34:58.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50031.log 2026-03-23T18:34:58.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83794.log 2026-03-23T18:34:58.587 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-osd.2.log: /var/log/ceph/ceph-client.admin.50031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50031.log.gz 2026-03-23T18:34:58.594 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88478.log 2026-03-23T18:34:58.595 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83794.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60376.log 2026-03-23T18:34:58.595 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83794.log.gz 2026-03-23T18:34:58.595 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88478.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88478.log.gz 2026-03-23T18:34:58.595 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89874.log 2026-03-23T18:34:58.595 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60376.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60376.log.gz 2026-03-23T18:34:58.596 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71823.log 2026-03-23T18:34:58.596 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89874.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89874.log.gz 2026-03-23T18:34:58.596 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55838.log 2026-03-23T18:34:58.597 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71823.log.gz 2026-03-23T18:34:58.597 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73808.log 2026-03-23T18:34:58.597 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55838.log.gz 2026-03-23T18:34:58.597 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44351.log 2026-03-23T18:34:58.597 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73808.log.gz 2026-03-23T18:34:58.598 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54768.log 2026-03-23T18:34:58.598 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44351.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.44351.log.gz 2026-03-23T18:34:58.598 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85547.log 2026-03-23T18:34:58.599 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.54768.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54768.log.gz 2026-03-23T18:34:58.599 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69476.log 2026-03-23T18:34:58.599 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85547.log.gz 2026-03-23T18:34:58.599 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42605.log 2026-03-23T18:34:58.600 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69476.log.gz 2026-03-23T18:34:58.600 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27200.log 2026-03-23T18:34:58.600 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.42605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42605.log.gz 2026-03-23T18:34:58.600 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43093.log 2026-03-23T18:34:58.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.27200.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27200.log.gz 2026-03-23T18:34:58.601 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58036.log 2026-03-23T18:34:58.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.43093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43093.log.gz 2026-03-23T18:34:58.601 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74612.log 2026-03-23T18:34:58.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58036.log.gz 2026-03-23T18:34:58.602 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69665.log 2026-03-23T18:34:58.602 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74612.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.74612.log.gz 2026-03-23T18:34:58.602 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38736.log 2026-03-23T18:34:58.603 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69665.log: 13.0% -- replaced with /var/log/ceph/ceph-client.admin.69665.log.gz 2026-03-23T18:34:58.603 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77893.log 2026-03-23T18:34:58.603 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.38736.log: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.38736.log.gz 2026-03-23T18:34:58.603 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29462.log 2026-03-23T18:34:58.604 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77893.log.gz 2026-03-23T18:34:58.604 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29956.log 2026-03-23T18:34:58.604 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29462.log.gz 2026-03-23T18:34:58.604 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70495.log 2026-03-23T18:34:58.605 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29956.log.gz 2026-03-23T18:34:58.605 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61982.log 2026-03-23T18:34:58.605 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70495.log.gz 2026-03-23T18:34:58.605 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.28564.log 2026-03-23T18:34:58.606 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61982.log.gz 2026-03-23T18:34:58.606 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44831.log 2026-03-23T18:34:58.606 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28564.log.gz 2026-03-23T18:34:58.606 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55184.log 2026-03-23T18:34:58.607 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44831.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.44831.log.gz 2026-03-23T18:34:58.607 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34069.log 2026-03-23T18:34:58.607 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55184.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.55184.log.gz 2026-03-23T18:34:58.607 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32030.log 2026-03-23T18:34:58.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.34069.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.34069.log.gz 2026-03-23T18:34:58.608 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56570.log 2026-03-23T18:34:58.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32030.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32030.log.gz 2026-03-23T18:34:58.608 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75567.log 2026-03-23T18:34:58.609 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56570.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.56570.log.gz 2026-03-23T18:34:58.609 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36226.log 2026-03-23T18:34:58.609 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75567.log.gz 2026-03-23T18:34:58.609 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28920.log 2026-03-23T18:34:58.610 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36226.log.gz 2026-03-23T18:34:58.610 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25788.log 2026-03-23T18:34:58.610 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28920.log.gz 2026-03-23T18:34:58.610 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33943.log 2026-03-23T18:34:58.611 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25788.log.gz 2026-03-23T18:34:58.611 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26333.log 2026-03-23T18:34:58.611 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.33943.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.33943.log.gz 2026-03-23T18:34:58.611 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36481.log 2026-03-23T18:34:58.612 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.26333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26333.log.gz 2026-03-23T18:34:58.612 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.50590.log 2026-03-23T18:34:58.612 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.36481.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36481.log.gz 2026-03-23T18:34:58.612 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62219.log 2026-03-23T18:34:58.613 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50590.log.gz 2026-03-23T18:34:58.613 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59050.log 2026-03-23T18:34:58.613 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62219.log.gz 2026-03-23T18:34:58.613 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53847.log 2026-03-23T18:34:58.614 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59050.log.gz 2026-03-23T18:34:58.614 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64394.log 2026-03-23T18:34:58.614 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.53847.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53847.log.gz 2026-03-23T18:34:58.614 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90447.log 2026-03-23T18:34:58.615 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64394.log.gz 2026-03-23T18:34:58.615 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39935.log 2026-03-23T18:34:58.615 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90447.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.90447.log.gz 2026-03-23T18:34:58.615 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32252.log 2026-03-23T18:34:58.616 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.39935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39935.log.gz 2026-03-23T18:34:58.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77764.log 2026-03-23T18:34:58.616 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32252.log.gz 2026-03-23T18:34:58.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25784.log 2026-03-23T18:34:58.617 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77764.log.gz 2026-03-23T18:34:58.617 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90406.log 2026-03-23T18:34:58.617 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.25784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25784.log.gz 2026-03-23T18:34:58.617 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mon.a.log 2026-03-23T18:34:58.618 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90406.log: 56.7% -- replaced with /var/log/ceph/ceph-client.admin.90406.log.gz 2026-03-23T18:34:58.618 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67985.log 2026-03-23T18:34:58.627 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-mon.a.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66068.log 2026-03-23T18:34:58.627 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67985.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.67985.log.gz 2026-03-23T18:34:58.634 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51153.log 2026-03-23T18:34:58.635 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66068.log.gz 2026-03-23T18:34:58.646 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45108.log 2026-03-23T18:34:58.647 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.51153.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.51153.log.gz 2026-03-23T18:34:58.654 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58165.log 2026-03-23T18:34:58.655 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.45108.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45108.log.gz 2026-03-23T18:34:58.662 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88330.log 2026-03-23T18:34:58.663 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58165.log.gz 2026-03-23T18:34:58.674 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69081.log 2026-03-23T18:34:58.675 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88330.log.gz 2026-03-23T18:34:58.686 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49729.log 2026-03-23T18:34:58.687 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69081.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69081.log.gz 2026-03-23T18:34:58.698 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.55730.log 2026-03-23T18:34:58.699 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.49729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49729.log.gz 2026-03-23T18:34:58.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57692.log 2026-03-23T18:34:58.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.55730.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.55730.log.gz 2026-03-23T18:34:58.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89831.log 2026-03-23T18:34:58.715 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.57692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57692.log.gz 2026-03-23T18:34:58.722 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31078.log 2026-03-23T18:34:58.723 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89831.log.gz 2026-03-23T18:34:58.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89229.log 2026-03-23T18:34:58.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.31078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31078.log.gz 2026-03-23T18:34:58.746 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46793.log 2026-03-23T18:34:58.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89229.log.gz 2026-03-23T18:34:58.754 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44047.log 2026-03-23T18:34:58.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.46793.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.46793.log.gz 2026-03-23T18:34:58.762 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69770.log 2026-03-23T18:34:58.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.44047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44047.log.gz 2026-03-23T18:34:58.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61402.log 2026-03-23T18:34:58.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69770.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.69770.log.gz 2026-03-23T18:34:58.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48756.log 2026-03-23T18:34:58.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61402.log.gz 2026-03-23T18:34:58.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77166.log 2026-03-23T18:34:58.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.48756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48756.log.gz 2026-03-23T18:34:58.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87703.log 2026-03-23T18:34:58.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77166.log.gz 2026-03-23T18:34:58.818 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50756.log 2026-03-23T18:34:58.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87703.log.gz 2026-03-23T18:34:58.826 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.60120.log 2026-03-23T18:34:58.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50756.log.gz 2026-03-23T18:34:58.847 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28628.log 2026-03-23T18:34:58.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60120.log.gz 2026-03-23T18:34:58.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41335.log 2026-03-23T18:34:58.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.28628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28628.log.gz 2026-03-23T18:34:58.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74537.log 2026-03-23T18:34:58.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.41335.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41335.log.gz 2026-03-23T18:34:58.870 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67314.log 2026-03-23T18:34:58.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74537.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74537.log.gz 2026-03-23T18:34:58.882 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50547.log 2026-03-23T18:34:58.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67314.log.gz 2026-03-23T18:34:58.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29441.log 2026-03-23T18:34:58.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.50547.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.50547.log.gz 2026-03-23T18:34:58.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32337.log 2026-03-23T18:34:58.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.29441.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29441.log.gz 2026-03-23T18:34:58.906 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64257.log 2026-03-23T18:34:58.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.32337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32337.log.gz 2026-03-23T18:34:58.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64257.log.gz 2026-03-23T18:34:59.597 INFO:teuthology.orchestra.run.vm04.stderr: 91.4% -- replaced with /var/log/ceph/ceph-mon.a.log.gz 2026-03-23T18:35:09.779 INFO:teuthology.orchestra.run.vm04.stderr: 94.0% -- replaced with /var/log/ceph/ceph-osd.2.log.gz 2026-03-23T18:35:14.375 INFO:teuthology.orchestra.run.vm04.stderr: 94.1% -- replaced with /var/log/ceph/ceph-osd.0.log.gz 2026-03-23T18:35:15.832 INFO:teuthology.orchestra.run.vm04.stderr: 94.2% -- replaced with /var/log/ceph/ceph-osd.1.log.gz 2026-03-23T18:35:15.832 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-23T18:35:15.833 INFO:teuthology.orchestra.run.vm04.stderr:real 0m18.140s 2026-03-23T18:35:15.833 INFO:teuthology.orchestra.run.vm04.stderr:user 0m44.070s 2026-03-23T18:35:15.833 INFO:teuthology.orchestra.run.vm04.stderr:sys 0m3.847s 2026-03-23T18:35:15.833 INFO:tasks.ceph:Archiving logs... 2026-03-23T18:35:15.833 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/log/ceph to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501/remote/vm04/log 2026-03-23T18:35:15.833 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/log/ceph -- . 
2026-03-23T18:35:18.087 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-23T18:35:18.089 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 2026-03-23T18:35:18.090 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-23T18:35:18.145 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system. 2026-03-23T18:35:18.146 DEBUG:teuthology.orchestra.run.vm04:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done 2026-03-23T18:35:18.241 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists... 2026-03-23T18:35:18.369 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree... 2026-03-23T18:35:18.369 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information... 2026-03-23T18:35:18.467 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required: 2026-03-23T18:35:18.467 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-23T18:35:18.467 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-23T18:35:18.467 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them. 
2026-03-23T18:35:18.483 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED: 2026-03-23T18:35:18.484 INFO:teuthology.orchestra.run.vm04.stdout: ceph* 2026-03-23T18:35:18.638 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 1 to remove and 37 not upgraded. 2026-03-23T18:35:18.638 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 47.1 kB disk space will be freed. 2026-03-23T18:35:18.685 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126150 files and directories currently installed.) 2026-03-23T18:35:18.687 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph (20.2.0-712-g70f8415b-1jammy) ... 2026-03-23T18:35:19.606 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-23T18:35:19.641 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists... 2026-03-23T18:35:19.773 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree... 2026-03-23T18:35:19.773 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information... 
2026-03-23T18:35:19.875 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required: 2026-03-23T18:35:19.875 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-23T18:35:19.875 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev 2026-03-23T18:35:19.875 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-23T18:35:19.883 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED: 2026-03-23T18:35:19.884 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm* cephadm* 2026-03-23T18:35:20.030 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 2 to remove and 37 not upgraded. 2026-03-23T18:35:20.030 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 2177 kB disk space will be freed. 2026-03-23T18:35:20.065 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126148 files and directories currently installed.) 2026-03-23T18:35:20.067 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ... 
2026-03-23T18:35:20.078 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-cephadm, directory '/usr/share/ceph/mgr/cephadm/services' not empty so not removed
2026-03-23T18:35:20.086 INFO:teuthology.orchestra.run.vm04.stdout:Removing cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:20.115 INFO:teuthology.orchestra.run.vm04.stdout:Looking for files to backup/remove ...
2026-03-23T18:35:20.116 INFO:teuthology.orchestra.run.vm04.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-23T18:35:20.118 INFO:teuthology.orchestra.run.vm04.stdout:Removing user `cephadm' ...
2026-03-23T18:35:20.118 INFO:teuthology.orchestra.run.vm04.stdout:Warning: group `nogroup' has no more members.
2026-03-23T18:35:20.146 INFO:teuthology.orchestra.run.vm04.stdout:Done.
2026-03-23T18:35:20.168 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T18:35:20.261 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 126062 files and directories currently installed.)
2026-03-23T18:35:20.262 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:21.132 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:21.166 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:21.296 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:21.297 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:21.396 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:21.396 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-23T18:35:21.396 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-23T18:35:21.396 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:21.404 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:21.405 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds*
2026-03-23T18:35:21.550 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 1 to remove and 37 not upgraded.
2026-03-23T18:35:21.550 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 6851 kB disk space will be freed.
2026-03-23T18:35:21.585 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 126062 files and directories currently installed.)
2026-03-23T18:35:21.587 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:21.973 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T18:35:22.062 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 126054 files and directories currently installed.)
2026-03-23T18:35:22.063 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:23.323 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:23.357 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:23.490 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:23.490 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:23.594 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:23.594 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh python3-cachetools python3-cheroot python3-cherrypy3
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes python3-natsort python3-portend python3-psutil
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-threadpoolctl python3-webob python3-websocket python3-zc.lockfile
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout: sg3-utils sg3-utils-udev
2026-03-23T18:35:23.595 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:23.603 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:23.603 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-23T18:35:23.603 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-k8sevents*
2026-03-23T18:35:23.749 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 4 to remove and 37 not upgraded.
2026-03-23T18:35:23.749 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 219 MB disk space will be freed.
2026-03-23T18:35:23.784 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 126054 files and directories currently installed.)
2026-03-23T18:35:23.786 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:23.798 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:23.810 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-diskprediction-local, directory '/usr/share/ceph/mgr/diskprediction_local' not empty so not removed
2026-03-23T18:35:23.819 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:23.876 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/services/auth' not empty so not removed
2026-03-23T18:35:23.876 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/plugins' not empty so not removed
2026-03-23T18:35:23.876 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/model' not empty so not removed
2026-03-23T18:35:23.876 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/controllers' not empty so not removed
2026-03-23T18:35:23.876 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/api' not empty so not removed
2026-03-23T18:35:23.884 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:24.317 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 124272 files and directories currently installed.)
2026-03-23T18:35:24.319 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:24.682 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr, directory '/var/lib/ceph/mgr' not empty so not removed
2026-03-23T18:35:25.569 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:25.604 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:25.741 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:25.741 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:25.844 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:25.853 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:25.853 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-23T18:35:25.998 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 7 to remove and 37 not upgraded.
2026-03-23T18:35:25.998 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 732 MB disk space will be freed.
2026-03-23T18:35:26.032 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 124271 files and directories currently installed.)
2026-03-23T18:35:26.034 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:26.096 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:26.480 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:26.895 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:27.266 INFO:teuthology.orchestra.run.vm04.stdout:Removing radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:27.672 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:27.696 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:28.106 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T18:35:28.144 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-23T18:35:28.211 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123780 files and directories currently installed.)
2026-03-23T18:35:28.212 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:28.730 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:29.072 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mon, directory '/var/lib/ceph/mon' not empty so not removed
2026-03-23T18:35:29.082 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:29.450 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:29.925 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:30.258 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-osd, directory '/var/lib/ceph/osd' not empty so not removed
2026-03-23T18:35:31.118 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:31.153 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:31.284 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:31.284 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-23T18:35:31.383 INFO:teuthology.orchestra.run.vm04.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-23T18:35:31.384 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-23T18:35:31.384 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-23T18:35:31.384 INFO:teuthology.orchestra.run.vm04.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-23T18:35:31.384 INFO:teuthology.orchestra.run.vm04.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-23T18:35:31.384 INFO:teuthology.orchestra.run.vm04.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:31.384 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:31.392 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:31.393 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse*
2026-03-23T18:35:31.535 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 1 to remove and 37 not upgraded.
2026-03-23T18:35:31.535 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 2932 kB disk space will be freed.
2026-03-23T18:35:31.569 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123764 files and directories currently installed.)
2026-03-23T18:35:31.570 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:31.946 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T18:35:32.035 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-23T18:35:32.036 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:33.258 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:33.292 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:33.438 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:33.438 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout:Package 'ceph-test' is not installed, so not removed
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:33.538 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:33.553 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:33.553 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:33.586 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:33.722 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:33.723 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout:Package 'ceph-volume' is not installed, so not removed
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:33.822 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:33.838 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:33.838 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:33.870 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:34.002 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:34.002 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:34.101 INFO:teuthology.orchestra.run.vm04.stdout:Package 'radosgw' is not installed, so not removed
2026-03-23T18:35:34.101 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:34.101 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:34.101 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:34.101 INFO:teuthology.orchestra.run.vm04.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-23T18:35:34.101 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:34.102 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:34.117 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:34.117 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:34.150 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:34.287 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:34.288 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout: socat xmlstarlet
2026-03-23T18:35:34.387 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:34.396 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:34.396 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs* python3-rados* python3-rgw*
2026-03-23T18:35:34.538 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 3 to remove and 37 not upgraded.
2026-03-23T18:35:34.538 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 2086 kB disk space will be freed.
2026-03-23T18:35:34.573 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-23T18:35:34.574 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:34.586 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:34.597 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:35.483 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:35.520 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:35.653 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:35.653 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:Package 'python3-rgw' is not installed, so not removed
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:  socat xmlstarlet
2026-03-23T18:35:35.754 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:35.770 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:35.770 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:35.805 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:35.936 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:35.936 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:Package 'python3-cephfs' is not installed, so not removed
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:  socat xmlstarlet
2026-03-23T18:35:36.038 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:36.054 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:36.054 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:36.088 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:36.225 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:36.225 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:36.324 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:  socat xmlstarlet
2026-03-23T18:35:36.325 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:36.333 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:36.333 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rbd*
2026-03-23T18:35:36.479 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 1 to remove and 37 not upgraded.
2026-03-23T18:35:36.479 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 1205 kB disk space will be freed.
2026-03-23T18:35:36.516 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123731 files and directories currently installed.)
2026-03-23T18:35:36.517 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:37.404 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:37.439 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:37.572 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:37.572 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:37.674 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:37.674 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:37.674 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:  socat xmlstarlet
2026-03-23T18:35:37.675 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:37.683 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:37.683 INFO:teuthology.orchestra.run.vm04.stdout:  libcephfs-daemon* libcephfs-dev* libcephfs2*
2026-03-23T18:35:37.826 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 3 to remove and 37 not upgraded.
2026-03-23T18:35:37.826 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 2851 kB disk space will be freed.
2026-03-23T18:35:37.860 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123723 files and directories currently installed.)
2026-03-23T18:35:37.862 INFO:teuthology.orchestra.run.vm04.stdout:Removing libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:37.873 INFO:teuthology.orchestra.run.vm04.stdout:Removing libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:37.884 INFO:teuthology.orchestra.run.vm04.stdout:Removing libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:37.908 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-23T18:35:38.786 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:38.821 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:38.955 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:38.955 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:Package 'libcephfs-dev' is not installed, so not removed
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:39.057 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:39.058 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:39.058 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:39.058 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-23T18:35:39.058 INFO:teuthology.orchestra.run.vm04.stdout:  socat xmlstarlet
2026-03-23T18:35:39.058 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:39.073 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:39.074 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:39.108 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:39.247 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:39.247 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:39.349 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:39.357 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:39.357 INFO:teuthology.orchestra.run.vm04.stdout:  librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-23T18:35:39.358 INFO:teuthology.orchestra.run.vm04.stdout:  qemu-block-extra* rbd-fuse*
2026-03-23T18:35:39.501 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 7 to remove and 37 not upgraded.
2026-03-23T18:35:39.501 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 59.2 MB disk space will be freed.
2026-03-23T18:35:39.536 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123701 files and directories currently installed.)
2026-03-23T18:35:39.537 INFO:teuthology.orchestra.run.vm04.stdout:Removing rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:39.548 INFO:teuthology.orchestra.run.vm04.stdout:Removing libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:39.560 INFO:teuthology.orchestra.run.vm04.stdout:Removing libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:39.570 INFO:teuthology.orchestra.run.vm04.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-23T18:35:39.929 INFO:teuthology.orchestra.run.vm04.stdout:Removing librbd1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:39.941 INFO:teuthology.orchestra.run.vm04.stdout:Removing librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:39.951 INFO:teuthology.orchestra.run.vm04.stdout:Removing librados2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:39.976 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T18:35:40.011 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-23T18:35:40.073 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-23T18:35:40.074 INFO:teuthology.orchestra.run.vm04.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-23T18:35:41.365 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:41.399 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:41.535 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:41.535 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:Package 'librbd1' is not installed, so not removed
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:41.636 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:41.652 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:41.652 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:41.685 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:41.820 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:41.820 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:41.921 INFO:teuthology.orchestra.run.vm04.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-23T18:35:41.921 INFO:teuthology.orchestra.run.vm04.stdout:The following packages were automatically installed and are no longer required:
2026-03-23T18:35:41.921 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:41.921 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-23T18:35:41.921 INFO:teuthology.orchestra.run.vm04.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:41.922 INFO:teuthology.orchestra.run.vm04.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-23T18:35:41.938 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 0 to remove and 37 not upgraded.
2026-03-23T18:35:41.938 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:41.940 DEBUG:teuthology.orchestra.run.vm04:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-23T18:35:41.994 DEBUG:teuthology.orchestra.run.vm04:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
2026-03-23T18:35:42.067 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:42.205 INFO:teuthology.orchestra.run.vm04.stdout:Building dependency tree...
2026-03-23T18:35:42.206 INFO:teuthology.orchestra.run.vm04.stdout:Reading state information...
2026-03-23T18:35:42.313 INFO:teuthology.orchestra.run.vm04.stdout:The following packages will be REMOVED:
2026-03-23T18:35:42.313 INFO:teuthology.orchestra.run.vm04.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-23T18:35:42.313 INFO:teuthology.orchestra.run.vm04.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-23T18:35:42.313 INFO:teuthology.orchestra.run.vm04.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-23T18:35:42.314 INFO:teuthology.orchestra.run.vm04.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-23T18:35:42.457 INFO:teuthology.orchestra.run.vm04.stdout:0 upgraded, 0 newly installed, 64 to remove and 37 not upgraded.
2026-03-23T18:35:42.457 INFO:teuthology.orchestra.run.vm04.stdout:After this operation, 96.8 MB disk space will be freed.
2026-03-23T18:35:42.491 INFO:teuthology.orchestra.run.vm04.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-23T18:35:42.492 INFO:teuthology.orchestra.run.vm04.stdout:Removing ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/volumes/fs/operations/versions' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/test_orchestrator' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telemetry' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telegraf' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/status' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/stats/fs' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/snap_schedule/fs' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/selftest' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rgw' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rbd_support' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/prometheus' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/progress' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/pg_autoscaler' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_support' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_perf_query' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/orchestrator' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/nfs' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/mirroring/fs/dir_map' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/localpool' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/iostat' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/insights' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/influx' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/devicehealth' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/crash' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/balancer' not empty so not removed
2026-03-23T18:35:42.507 INFO:teuthology.orchestra.run.vm04.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/alerts' not empty so not removed
2026-03-23T18:35:42.511 INFO:teuthology.orchestra.run.vm04.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-23T18:35:42.522 INFO:teuthology.orchestra.run.vm04.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-23T18:35:42.532 INFO:teuthology.orchestra.run.vm04.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-23T18:35:42.543 INFO:teuthology.orchestra.run.vm04.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-23T18:35:42.553 INFO:teuthology.orchestra.run.vm04.stdout:Removing libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:42.562 INFO:teuthology.orchestra.run.vm04.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-23T18:35:42.572 INFO:teuthology.orchestra.run.vm04.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T18:35:42.583 INFO:teuthology.orchestra.run.vm04.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T18:35:42.592 INFO:teuthology.orchestra.run.vm04.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-23T18:35:42.610 INFO:teuthology.orchestra.run.vm04.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-23T18:35:42.620 INFO:teuthology.orchestra.run.vm04.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-23T18:35:42.630 INFO:teuthology.orchestra.run.vm04.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-23T18:35:42.640 INFO:teuthology.orchestra.run.vm04.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-23T18:35:42.649 INFO:teuthology.orchestra.run.vm04.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-23T18:35:42.658 INFO:teuthology.orchestra.run.vm04.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-23T18:35:42.668 INFO:teuthology.orchestra.run.vm04.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-23T18:35:42.677 INFO:teuthology.orchestra.run.vm04.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-23T18:35:42.687 INFO:teuthology.orchestra.run.vm04.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-23T18:35:42.697 INFO:teuthology.orchestra.run.vm04.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-23T18:35:42.706 INFO:teuthology.orchestra.run.vm04.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-23T18:35:42.715 INFO:teuthology.orchestra.run.vm04.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-23T18:35:42.725 INFO:teuthology.orchestra.run.vm04.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-23T18:35:42.735 INFO:teuthology.orchestra.run.vm04.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-23T18:35:42.745 INFO:teuthology.orchestra.run.vm04.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-23T18:35:42.755 INFO:teuthology.orchestra.run.vm04.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-23T18:35:42.763 INFO:teuthology.orchestra.run.vm04.stdout:update-initramfs: deferring update (trigger activated)
2026-03-23T18:35:42.772 INFO:teuthology.orchestra.run.vm04.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-23T18:35:42.787 INFO:teuthology.orchestra.run.vm04.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-23T18:35:42.796 INFO:teuthology.orchestra.run.vm04.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-23T18:35:43.178 INFO:teuthology.orchestra.run.vm04.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-23T18:35:43.191 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-23T18:35:43.246 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-23T18:35:43.494 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-23T18:35:43.545 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-23T18:35:43.592 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:43.639 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-23T18:35:43.691 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-23T18:35:43.755 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-23T18:35:43.802 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-23T18:35:43.847 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-23T18:35:43.892 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-23T18:35:43.937 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-23T18:35:43.983 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-23T18:35:44.028 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-23T18:35:44.073 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-23T18:35:44.190 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-23T18:35:44.245 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-23T18:35:44.292 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-23T18:35:44.337 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-23T18:35:44.385 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-23T18:35:44.432 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-23T18:35:44.477 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-23T18:35:44.523 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-rsa (4.8-1) ...
2026-03-23T18:35:44.570 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-23T18:35:44.620 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-23T18:35:44.632 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-23T18:35:44.677 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-23T18:35:44.723 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-23T18:35:44.770 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-23T18:35:44.819 INFO:teuthology.orchestra.run.vm04.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-23T18:35:44.865 INFO:teuthology.orchestra.run.vm04.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-23T18:35:44.886 INFO:teuthology.orchestra.run.vm04.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-23T18:35:45.246 INFO:teuthology.orchestra.run.vm04.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-23T18:35:45.257 INFO:teuthology.orchestra.run.vm04.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-23T18:35:45.290 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-23T18:35:45.301 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-23T18:35:45.349 INFO:teuthology.orchestra.run.vm04.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-23T18:35:45.365 INFO:teuthology.orchestra.run.vm04.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-171-generic
2026-03-23T18:35:49.674 INFO:teuthology.orchestra.run.vm04.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-23T18:35:49.677 DEBUG:teuthology.parallel:result is None
2026-03-23T18:35:49.677 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm04.local
2026-03-23T18:35:49.678 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-23T18:35:49.725 DEBUG:teuthology.orchestra.run.vm04:> sudo apt-get update
2026-03-23T18:35:50.019 INFO:teuthology.orchestra.run.vm04.stdout:Hit:1 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-23T18:35:50.019 INFO:teuthology.orchestra.run.vm04.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-23T18:35:50.118 INFO:teuthology.orchestra.run.vm04.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-23T18:35:50.220 INFO:teuthology.orchestra.run.vm04.stdout:Hit:4 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-23T18:35:52.027 INFO:teuthology.orchestra.run.vm04.stdout:Reading package lists...
2026-03-23T18:35:52.039 DEBUG:teuthology.parallel:result is None
2026-03-23T18:35:52.039 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-23T18:35:52.041 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-23T18:35:52.041 DEBUG:teuthology.orchestra.run.vm04:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:     remote           refid      st t when poll reach   delay   offset  jitter
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:==============================================================================
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:-ns1.blazing.de  213.172.96.14    3 u  204  256  377   31.937   -0.010   0.028
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:+ernie.gerger-ne 213.172.96.14    3 u  130  128  377   31.908   +0.050   0.068
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:-mx.ae9.eu       152.103.15.66    2 u    9  128  377   25.034   +0.102   0.035
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:*79.133.44.141   .MBGh.           1 u   16  128  377   20.567   +0.033   0.015
2026-03-23T18:35:52.305 INFO:teuthology.orchestra.run.vm04.stdout:+cp.hypermediaa. 189.97.54.122    2 u   28  128  377   25.065   +0.001   0.042
2026-03-23T18:35:52.306 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-23T18:35:52.308 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-23T18:35:52.311 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-23T18:35:52.314 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-23T18:35:52.316 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-23T18:35:52.318 INFO:teuthology.task.internal:Duration was 4133.457377 seconds
2026-03-23T18:35:52.321 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-23T18:35:52.323 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-23T18:35:52.323 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-23T18:35:52.345 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-23T18:35:52.345 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm04.local
2026-03-23T18:35:52.345 DEBUG:teuthology.orchestra.run.vm04:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-23T18:35:52.393 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-23T18:35:52.393 DEBUG:teuthology.orchestra.run.vm04:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-23T18:35:52.452 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-23T18:35:52.452 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-23T18:35:52.500 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-23T18:35:52.500 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-23T18:35:52.501 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-23T18:35:52.501 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-23T18:35:52.501 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-23T18:35:52.504 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 85.9% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-23T18:35:52.505 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-23T18:35:52.508 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-23T18:35:52.508 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-23T18:35:52.553 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-23T18:35:52.555 DEBUG:teuthology.orchestra.run.vm04:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-23T18:35:52.601 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = core
2026-03-23T18:35:52.608 DEBUG:teuthology.orchestra.run.vm04:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-23T18:35:52.652 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-23T18:35:52.652 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-23T18:35:52.655 INFO:teuthology.task.internal:Transferring archived files...
2026-03-23T18:35:52.657 DEBUG:teuthology.misc:Transferring archived files from vm04:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3501/remote/vm04
2026-03-23T18:35:52.658 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-23T18:35:52.701 INFO:teuthology.task.internal:Removing archive directory...
2026-03-23T18:35:52.701 DEBUG:teuthology.orchestra.run.vm04:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-23T18:35:52.745 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-23T18:35:52.747 INFO:teuthology.task.internal:Not uploading archives.
2026-03-23T18:35:52.747 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-23T18:35:52.749 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-23T18:35:52.749 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-23T18:35:52.789 INFO:teuthology.orchestra.run.vm04.stdout:   258068      4 drwxr-xr-x   2 ubuntu   ubuntu       4096 Mar 23 18:35 /home/ubuntu/cephtest
2026-03-23T18:35:52.789 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-23T18:35:52.796 INFO:teuthology.run:Summary data:
description: rbd/cli/{base/install clusters/{fixed-1} conf/{disable-pool-app} data-pool/ec features/defaults
  msgr-failures/few objectstore/bluestore-comp-zstd supported-random-distro$/{ubuntu_latest}
  workloads/rbd_cli_generic}
duration: 4133.45737695694
flavor: default
owner: kyr
success: true

2026-03-23T18:35:52.796 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-23T18:35:52.816 INFO:teuthology.run:pass