2026-03-25T15:24:33.921 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-25T15:24:33.926 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-25T15:24:33.948 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645
branch: tentacle
description: rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-hybrid supported-random-distro$/{centos_latest} workloads/rbd_cli_generic}
email: null
first_in_suite: false
flavor: default
job_id: '3645'
last_in_suite: false
machine_type: vps
name: kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: tentacle
  ansible.cephlab:
    branch: main
    repo: https://github.com/kshtsk/ceph-cm-ansible.git
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      logical_volumes:
        lv_1:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_2:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_3:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_4:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
      timezone: UTC
      volume_groups:
        vg_nvme:
          pvs: /dev/vdb,/dev/vdc,/dev/vdd,/dev/vde
  ceph:
    conf:
      client:
        rbd default format: 1
      global:
        mon client directed command retry: 5
        mon warn on pool no app: false
        ms inject socket failures: 5000
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        bluefs allocator: hybrid
        bluestore allocator: hybrid
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd shutdown pgref assert: true
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - \(OSD_SLOW_PING_TIME
    sha1: 70f8415b300f041766fa27faf7d5472699e32388
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      global:
        osd crush chooseleaf type: 0
        osd pool default pg num: 128
        osd pool default pgp num: 128
        osd pool default size: 2
      mon: {}
      osd:
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  cephadm:
    cephadm_binary_url: https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm
  install:
    ceph:
      flavor: default
      sha1: 70f8415b300f041766fa27faf7d5472699e32388
    extra_system_packages:
      deb:
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-tentacle
    sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 3051
sha1: 70f8415b300f041766fa27faf7d5472699e32388
sleep_before_teardown: 0
subset: 1/128
suite: rbd
suite_branch: tt-tentacle
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
targets:
  vm04.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMVY6yyFd+yKb0Czj+cE3uCZuJasW7ZXLrRixIHZmdx8en0TD8LaabV085w2jTe0EUcfOMcPCxMAA53W+PlTLSM=
tasks:
- install: null
- ceph: null
- workunit:
    clients:
      client.0:
      - rbd/cli_generic.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-20_22:04:26
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.3013333
2026-03-25T15:24:33.948 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa; will attempt to use it
2026-03-25T15:24:33.948 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa/tasks
2026-03-25T15:24:33.948 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-25T15:24:33.949 INFO:teuthology.task.internal:Checking packages...
2026-03-25T15:24:33.949 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash '70f8415b300f041766fa27faf7d5472699e32388'
2026-03-25T15:24:33.949 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-25T15:24:33.949 INFO:teuthology.packaging:ref: None
2026-03-25T15:24:33.949 INFO:teuthology.packaging:tag: None
2026-03-25T15:24:33.949 INFO:teuthology.packaging:branch: tentacle
2026-03-25T15:24:33.949 INFO:teuthology.packaging:sha1: 70f8415b300f041766fa27faf7d5472699e32388
2026-03-25T15:24:33.949 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=tentacle
2026-03-25T15:24:34.683 INFO:teuthology.task.internal:Found packages for ceph version 20.2.0-721.g5bb32787
2026-03-25T15:24:34.684 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-25T15:24:34.685 INFO:teuthology.task.internal:no buildpackages task found
2026-03-25T15:24:34.685 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-25T15:24:34.685 INFO:teuthology.task.internal:Saving configuration
2026-03-25T15:24:34.690 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-25T15:24:34.691 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-25T15:24:34.699 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm04.local', 'description': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-25 15:23:51.047694', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:04', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMVY6yyFd+yKb0Czj+cE3uCZuJasW7ZXLrRixIHZmdx8en0TD8LaabV085w2jTe0EUcfOMcPCxMAA53W+PlTLSM='}
2026-03-25T15:24:34.699 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-25T15:24:34.699 INFO:teuthology.task.internal:roles: ubuntu@vm04.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-25T15:24:34.699 INFO:teuthology.run_tasks:Running task console_log...
2026-03-25T15:24:34.705 DEBUG:teuthology.task.console_log:vm04 does not support IPMI; excluding
2026-03-25T15:24:34.705 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f266a7e8940>, signals=[15])
2026-03-25T15:24:34.705 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-25T15:24:34.706 INFO:teuthology.task.internal:Opening connections...
2026-03-25T15:24:34.706 DEBUG:teuthology.task.internal:connecting to ubuntu@vm04.local
2026-03-25T15:24:34.706 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-25T15:24:34.765 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-25T15:24:34.766 DEBUG:teuthology.orchestra.run.vm04:> uname -m
2026-03-25T15:24:34.917 INFO:teuthology.orchestra.run.vm04.stdout:x86_64
2026-03-25T15:24:34.917 DEBUG:teuthology.orchestra.run.vm04:> cat /etc/os-release
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:NAME="CentOS Stream"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:VERSION="9"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:ID="centos"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:ID_LIKE="rhel fedora"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:VERSION_ID="9"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:PLATFORM_ID="platform:el9"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:ANSI_COLOR="0;31"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:LOGO="fedora-logo-icon"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:HOME_URL="https://centos.org/"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-25T15:24:34.973 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-25T15:24:34.973 INFO:teuthology.lock.ops:Updating vm04.local on lock server
2026-03-25T15:24:34.978 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-25T15:24:34.980 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-25T15:24:34.981 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-25T15:24:34.981 DEBUG:teuthology.orchestra.run.vm04:> test '!' -e /home/ubuntu/cephtest
2026-03-25T15:24:35.027 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-25T15:24:35.028 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-25T15:24:35.029 DEBUG:teuthology.orchestra.run.vm04:> test -z $(ls -A /var/lib/ceph)
2026-03-25T15:24:35.083 INFO:teuthology.orchestra.run.vm04.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-25T15:24:35.083 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-25T15:24:35.091 DEBUG:teuthology.orchestra.run.vm04:> test -e /ceph-qa-ready
2026-03-25T15:24:35.137 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-25T15:24:35.316 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-25T15:24:35.318 INFO:teuthology.task.internal:Creating test directory...
2026-03-25T15:24:35.318 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-25T15:24:35.335 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-25T15:24:35.336 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-25T15:24:35.337 INFO:teuthology.task.internal:Creating archive directory...
2026-03-25T15:24:35.338 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-25T15:24:35.392 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-25T15:24:35.393 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-25T15:24:35.393 DEBUG:teuthology.orchestra.run.vm04:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-25T15:24:35.445 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-25T15:24:35.446 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-25T15:24:35.510 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-25T15:24:35.520 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-25T15:24:35.521 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-25T15:24:35.522 INFO:teuthology.task.internal:Configuring sudo...
2026-03-25T15:24:35.523 DEBUG:teuthology.orchestra.run.vm04:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-25T15:24:35.586 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-25T15:24:35.588 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-25T15:24:35.588 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-25T15:24:35.640 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-25T15:24:35.704 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-25T15:24:35.761 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:24:35.761 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-25T15:24:35.822 DEBUG:teuthology.orchestra.run.vm04:> sudo service rsyslog restart
2026-03-25T15:24:35.890 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-25T15:24:36.354 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-25T15:24:36.356 INFO:teuthology.task.internal:Starting timer...
2026-03-25T15:24:36.356 INFO:teuthology.run_tasks:Running task pcp...
2026-03-25T15:24:36.359 INFO:teuthology.run_tasks:Running task selinux...
2026-03-25T15:24:36.361 INFO:teuthology.task.selinux:Excluding vm04: VMs are not yet supported
2026-03-25T15:24:36.361 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-25T15:24:36.361 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-25T15:24:36.361 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-25T15:24:36.361 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-25T15:24:36.362 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}
2026-03-25T15:24:36.363 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/kshtsk/ceph-cm-ansible.git
2026-03-25T15:24:36.364 INFO:teuthology.repo_utils:Fetching github.com_kshtsk_ceph-cm-ansible_main from origin
2026-03-25T15:24:36.918 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main to origin/main
2026-03-25T15:24:36.924 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-25T15:24:36.925 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "logical_volumes": {"lv_1": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_2": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_3": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_4": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}}, "timezone": "UTC", "volume_groups": {"vg_nvme": {"pvs": "/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde"}}}' -i /tmp/teuth_ansible_inventoryiadm5vvj --limit vm04.local /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-25T15:26:13.660 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm04.local')]
2026-03-25T15:26:13.661 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm04.local'
2026-03-25T15:26:13.661 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-25T15:26:13.723 DEBUG:teuthology.orchestra.run.vm04:> true
2026-03-25T15:26:13.805 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm04.local'
2026-03-25T15:26:13.805 INFO:teuthology.run_tasks:Running task clock...
2026-03-25T15:26:13.808 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-25T15:26:13.808 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-25T15:26:13.808 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-25T15:26:13.889 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-25T15:26:13.904 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-25T15:26:13.933 INFO:teuthology.orchestra.run.vm04.stderr:sudo: ntpd: command not found
2026-03-25T15:26:13.947 INFO:teuthology.orchestra.run.vm04.stdout:506 Cannot talk to daemon
2026-03-25T15:26:13.962 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-25T15:26:13.979 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-25T15:26:14.034 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-25T15:26:14.508 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-25T15:26:14.508 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-25T15:26:14.508 INFO:teuthology.orchestra.run.vm04.stdout:^? 185.13.148.71 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-25T15:26:14.508 INFO:teuthology.orchestra.run.vm04.stdout:^? time.cloudflare.com 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-25T15:26:14.508 INFO:teuthology.orchestra.run.vm04.stdout:^? 130.162.237.177 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-25T15:26:14.508 INFO:teuthology.orchestra.run.vm04.stdout:^? ec2-18-192-244-117.eu-ce> 2 6 1 0 -1311us[-1311us] +/- 17ms
2026-03-25T15:26:14.508 INFO:teuthology.run_tasks:Running task install...
2026-03-25T15:26:14.533 DEBUG:teuthology.task.install:project ceph
2026-03-25T15:26:14.539 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-25T15:26:14.539 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-25T15:26:14.539 INFO:teuthology.task.install:Using flavor: default
2026-03-25T15:26:14.542 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-25T15:26:14.542 INFO:teuthology.task.install:extra packages: []
2026-03-25T15:26:14.543 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': [], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'tag': None, 'wait_for_package': False}
2026-03-25T15:26:14.543 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-25T15:26:15.124 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/centos/9/flavors/default/
2026-03-25T15:26:15.124 INFO:teuthology.task.install.rpm:Package version is 20.2.0-712.g70f8415b
2026-03-25T15:26:15.837 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://3.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch]
name=ceph noarch packages
baseurl=https://3.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-source]
name=ceph source packages
baseurl=https://3.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-25T15:26:15.837 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:26:15.837 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-25T15:26:15.865 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-jmespath, python3-xmltodict, s3cmd on remote rpm x86_64
2026-03-25T15:26:15.865 DEBUG:teuthology.orchestra.run.vm04:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;sha1/70f8415b300f041766fa27faf7d5472699e32388/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-25T15:26:15.938 DEBUG:teuthology.orchestra.run.vm04:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-25T15:26:16.019 DEBUG:teuthology.orchestra.run.vm04:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-25T15:26:16.087 INFO:teuthology.orchestra.run.vm04.stdout:check_obsoletes = 1
2026-03-25T15:26:16.088 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-25T15:26:16.259 INFO:teuthology.orchestra.run.vm04.stdout:41 files removed
2026-03-25T15:26:16.281 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-jmespath python3-xmltodict s3cmd
2026-03-25T15:26:18.194 INFO:teuthology.orchestra.run.vm04.stdout:ceph packages for x86_64 52 kB/s | 87 kB 00:01
2026-03-25T15:26:19.783 INFO:teuthology.orchestra.run.vm04.stdout:ceph noarch packages 11 kB/s | 18 kB 00:01
2026-03-25T15:26:21.211 INFO:teuthology.orchestra.run.vm04.stdout:ceph source packages 1.4 kB/s | 1.9 kB 00:01
2026-03-25T15:26:22.343 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - BaseOS 8.0 MB/s | 8.9 MB 00:01
2026-03-25T15:26:26.997 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - AppStream 6.9 MB/s | 27 MB 00:03
2026-03-25T15:26:32.269 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - CRB 3.9 MB/s | 8.0 MB 00:02
2026-03-25T15:26:33.622 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - Extras packages 45 kB/s | 21 kB 00:00
2026-03-25T15:26:34.655 INFO:teuthology.orchestra.run.vm04.stdout:Extra Packages for Enterprise Linux 21 MB/s | 20 MB 00:00
2026-03-25T15:26:40.645 INFO:teuthology.orchestra.run.vm04.stdout:lab-extras 59 kB/s | 50 kB 00:00
2026-03-25T15:26:42.212 INFO:teuthology.orchestra.run.vm04.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-25T15:26:42.212 INFO:teuthology.orchestra.run.vm04.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-25T15:26:42.246 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:26:42.250 INFO:teuthology.orchestra.run.vm04.stdout:======================================================================================
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout:======================================================================================
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout:Installing:
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: bzip2 x86_64 1.0.8-11.el9 baseos 55 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:20.2.0-712.g70f8415b.el9 ceph 6.5 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:20.2.0-712.g70f8415b.el9 ceph 5.9 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:20.2.0-712.g70f8415b.el9 ceph 939 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:20.2.0-712.g70f8415b.el9 ceph 154 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:20.2.0-712.g70f8415b.el9 ceph 962 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 173 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 11 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 7.4 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 50 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:20.2.0-712.g70f8415b.el9 ceph 24 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:20.2.0-712.g70f8415b.el9 ceph 84 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 298 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 1.0 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:20.2.0-712.g70f8415b.el9 ceph 34 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:20.2.0-712.g70f8415b.el9 ceph 866 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:20.2.0-712.g70f8415b.el9 ceph 126 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: perl-Test-Harness noarch 1:3.42-461.el9 appstream 295 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:20.2.0-712.g70f8415b.el9 ceph 163 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:20.2.0-712.g70f8415b.el9 ceph 324 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:20.2.0-712.g70f8415b.el9 ceph 304 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:20.2.0-712.g70f8415b.el9 ceph 99 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:20.2.0-712.g70f8415b.el9 ceph 91 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:20.2.0-712.g70f8415b.el9 ceph 2.9 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:20.2.0-712.g70f8415b.el9 ceph 180 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: s3cmd noarch 2.4.0-1.el9 epel 206 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout:Upgrading:
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:20.2.0-712.g70f8415b.el9 ceph 3.5 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:20.2.0-712.g70f8415b.el9 ceph 2.8 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout:Installing dependencies:
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp x86_64 20211102.0-4.el9 epel 551 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:20.2.0-712.g70f8415b.el9 ceph 24 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 43 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:20.2.0-712.g70f8415b.el9 ceph 2.3 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 290 k
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:20.2.0-712.g70f8415b.el9 ceph 5.0 M
2026-03-25T15:26:42.251 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:20.2.0-712.g70f8415b.el9 ceph 17 M
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:20.2.0-712.g70f8415b.el9 ceph-noarch 17 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:20.2.0-712.g70f8415b.el9 ceph 25 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup x86_64 2.8.1-3.el9 baseos 351 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: fuse x86_64 2.9.9-17.el9 baseos 80 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data noarch 1.46.7-10.el9 epel 19 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-proxy2 x86_64 2:20.2.0-712.g70f8415b.el9 ceph 24 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:20.2.0-712.g70f8415b.el9 ceph 164 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libnbd x86_64 1.20.3-4.el9 appstream 164 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:20.2.0-712.g70f8415b.el9 ceph 250 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:20.2.0-712.g70f8415b.el9 ceph 6.4 M
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: lua x86_64 5.4.4-4.el9 appstream 188 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel x86_64 5.4.4-4.el9 crb 22 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: luarocks noarch 3.9.2-5.el9 epel 151 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: pciutils x86_64 3.7.0-7.el9 baseos 93 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: perl-Benchmark noarch 1.23-483.el9 appstream 26 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: protobuf x86_64 3.14.0-17.el9 appstream 1.0 M
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler x86_64 3.14.0-17.el9 crb 862 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-25T15:26:42.252 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:20.2.0-712.g70f8415b.el9 ceph 45 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:20.2.0-712.g70f8415b.el9 ceph 175 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio x86_64 1.46.7-10.el9 epel 2.0 M 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 epel 144 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 
epel 11 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging noarch 20.9-5.el9 appstream 77 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf noarch 
3.14.0-17.el9 appstream 267 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing noarch 2.4.7-9.el9 baseos 150 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 
2.0-10.el9 epel 20 k 2026-03-25T15:26:42.253 INFO:teuthology.orchestra.run.vm04.stdout: qatlib x86_64 25.08.0-2.el9 appstream 240 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs x86_64 1.3.1-1.el9 appstream 66 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: unzip x86_64 6.0-59.el9 baseos 182 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: zip x86_64 3.0-35.el9 baseos 266 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:Installing weak dependencies: 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service x86_64 25.08.0-2.el9 appstream 37 k 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:====================================================================================== 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:Install 136 Packages 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:Upgrade 2 Packages 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:Total download size: 267 M 2026-03-25T15:26:42.254 INFO:teuthology.orchestra.run.vm04.stdout:Downloading Packages: 2026-03-25T15:26:44.151 INFO:teuthology.orchestra.run.vm04.stdout:(1/138): ceph-20.2.0-712.g70f8415b.el9.x86_64.r 9.8 kB/s | 6.5 kB 00:00 2026-03-25T15:26:45.710 
INFO:teuthology.orchestra.run.vm04.stdout:(2/138): ceph-fuse-20.2.0-712.g70f8415b.el9.x86 603 kB/s | 939 kB 00:01 2026-03-25T15:26:45.907 INFO:teuthology.orchestra.run.vm04.stdout:(3/138): ceph-immutable-object-cache-20.2.0-712 779 kB/s | 154 kB 00:00 2026-03-25T15:26:46.350 INFO:teuthology.orchestra.run.vm04.stdout:(4/138): ceph-base-20.2.0-712.g70f8415b.el9.x86 2.1 MB/s | 5.9 MB 00:02 2026-03-25T15:26:46.625 INFO:teuthology.orchestra.run.vm04.stdout:(5/138): ceph-mgr-20.2.0-712.g70f8415b.el9.x86_ 3.4 MB/s | 962 kB 00:00 2026-03-25T15:26:46.975 INFO:teuthology.orchestra.run.vm04.stdout:(6/138): ceph-mds-20.2.0-712.g70f8415b.el9.x86_ 2.2 MB/s | 2.3 MB 00:01 2026-03-25T15:26:47.748 INFO:teuthology.orchestra.run.vm04.stdout:(7/138): ceph-mon-20.2.0-712.g70f8415b.el9.x86_ 4.5 MB/s | 5.0 MB 00:01 2026-03-25T15:26:48.576 INFO:teuthology.orchestra.run.vm04.stdout:(8/138): ceph-common-20.2.0-712.g70f8415b.el9.x 4.7 MB/s | 24 MB 00:05 2026-03-25T15:26:48.718 INFO:teuthology.orchestra.run.vm04.stdout:(9/138): ceph-selinux-20.2.0-712.g70f8415b.el9. 177 kB/s | 25 kB 00:00 2026-03-25T15:26:51.284 INFO:teuthology.orchestra.run.vm04.stdout:(10/138): ceph-radosgw-20.2.0-712.g70f8415b.el9 6.7 MB/s | 24 MB 00:03 2026-03-25T15:26:51.432 INFO:teuthology.orchestra.run.vm04.stdout:(11/138): libcephfs-devel-20.2.0-712.g70f8415b. 
233 kB/s | 34 kB 00:00 2026-03-25T15:26:51.830 INFO:teuthology.orchestra.run.vm04.stdout:(12/138): libcephfs-proxy2-20.2.0-712.g70f8415b 61 kB/s | 24 kB 00:00 2026-03-25T15:26:52.115 INFO:teuthology.orchestra.run.vm04.stdout:(13/138): libcephfs2-20.2.0-712.g70f8415b.el9.x 3.0 MB/s | 866 kB 00:00 2026-03-25T15:26:52.275 INFO:teuthology.orchestra.run.vm04.stdout:(14/138): ceph-osd-20.2.0-712.g70f8415b.el9.x86 3.2 MB/s | 17 MB 00:05 2026-03-25T15:26:52.276 INFO:teuthology.orchestra.run.vm04.stdout:(15/138): libcephsqlite-20.2.0-712.g70f8415b.el 1.0 MB/s | 164 kB 00:00 2026-03-25T15:26:52.414 INFO:teuthology.orchestra.run.vm04.stdout:(16/138): librados-devel-20.2.0-712.g70f8415b.e 905 kB/s | 126 kB 00:00 2026-03-25T15:26:52.416 INFO:teuthology.orchestra.run.vm04.stdout:(17/138): libradosstriper1-20.2.0-712.g70f8415b 1.7 MB/s | 250 kB 00:00 2026-03-25T15:26:52.566 INFO:teuthology.orchestra.run.vm04.stdout:(18/138): python3-ceph-argparse-20.2.0-712.g70f 302 kB/s | 45 kB 00:00 2026-03-25T15:26:52.714 INFO:teuthology.orchestra.run.vm04.stdout:(19/138): python3-ceph-common-20.2.0-712.g70f84 1.2 MB/s | 175 kB 00:00 2026-03-25T15:26:52.862 INFO:teuthology.orchestra.run.vm04.stdout:(20/138): python3-cephfs-20.2.0-712.g70f8415b.e 1.1 MB/s | 163 kB 00:00 2026-03-25T15:26:53.014 INFO:teuthology.orchestra.run.vm04.stdout:(21/138): python3-rados-20.2.0-712.g70f8415b.el 2.1 MB/s | 324 kB 00:00 2026-03-25T15:26:53.167 INFO:teuthology.orchestra.run.vm04.stdout:(22/138): python3-rbd-20.2.0-712.g70f8415b.el9. 1.9 MB/s | 304 kB 00:00 2026-03-25T15:26:53.314 INFO:teuthology.orchestra.run.vm04.stdout:(23/138): python3-rgw-20.2.0-712.g70f8415b.el9. 
677 kB/s | 99 kB 00:00 2026-03-25T15:26:53.467 INFO:teuthology.orchestra.run.vm04.stdout:(24/138): rbd-fuse-20.2.0-712.g70f8415b.el9.x86 595 kB/s | 91 kB 00:00 2026-03-25T15:26:53.923 INFO:teuthology.orchestra.run.vm04.stdout:(25/138): rbd-mirror-20.2.0-712.g70f8415b.el9.x 6.4 MB/s | 2.9 MB 00:00 2026-03-25T15:26:54.076 INFO:teuthology.orchestra.run.vm04.stdout:(26/138): rbd-nbd-20.2.0-712.g70f8415b.el9.x86_ 1.1 MB/s | 180 kB 00:00 2026-03-25T15:26:54.229 INFO:teuthology.orchestra.run.vm04.stdout:(27/138): ceph-grafana-dashboards-20.2.0-712.g7 284 kB/s | 43 kB 00:00 2026-03-25T15:26:54.366 INFO:teuthology.orchestra.run.vm04.stdout:(28/138): librgw2-20.2.0-712.g70f8415b.el9.x86_ 3.3 MB/s | 6.4 MB 00:01 2026-03-25T15:26:54.374 INFO:teuthology.orchestra.run.vm04.stdout:(29/138): ceph-mgr-cephadm-20.2.0-712.g70f8415b 1.2 MB/s | 173 kB 00:00 2026-03-25T15:26:55.398 INFO:teuthology.orchestra.run.vm04.stdout:(30/138): ceph-mgr-diskprediction-local-20.2.0- 7.2 MB/s | 7.4 MB 00:01 2026-03-25T15:26:55.537 INFO:teuthology.orchestra.run.vm04.stdout:(31/138): ceph-mgr-modules-core-20.2.0-712.g70f 2.0 MB/s | 290 kB 00:00 2026-03-25T15:26:55.672 INFO:teuthology.orchestra.run.vm04.stdout:(32/138): ceph-mgr-rook-20.2.0-712.g70f8415b.el 374 kB/s | 50 kB 00:00 2026-03-25T15:26:55.809 INFO:teuthology.orchestra.run.vm04.stdout:(33/138): ceph-prometheus-alerts-20.2.0-712.g70 126 kB/s | 17 kB 00:00 2026-03-25T15:26:55.954 INFO:teuthology.orchestra.run.vm04.stdout:(34/138): ceph-volume-20.2.0-712.g70f8415b.el9. 
2.0 MB/s | 298 kB 00:00 2026-03-25T15:26:56.114 INFO:teuthology.orchestra.run.vm04.stdout:(35/138): cephadm-20.2.0-712.g70f8415b.el9.noar 6.2 MB/s | 1.0 MB 00:00 2026-03-25T15:26:56.247 INFO:teuthology.orchestra.run.vm04.stdout:(36/138): bzip2-1.0.8-11.el9.x86_64.rpm 413 kB/s | 55 kB 00:00 2026-03-25T15:26:56.333 INFO:teuthology.orchestra.run.vm04.stdout:(37/138): cryptsetup-2.8.1-3.el9.x86_64.rpm 4.0 MB/s | 351 kB 00:00 2026-03-25T15:26:56.367 INFO:teuthology.orchestra.run.vm04.stdout:(38/138): fuse-2.9.9-17.el9.x86_64.rpm 2.3 MB/s | 80 kB 00:00 2026-03-25T15:26:56.397 INFO:teuthology.orchestra.run.vm04.stdout:(39/138): ledmon-libs-1.1.0-3.el9.x86_64.rpm 1.3 MB/s | 40 kB 00:00 2026-03-25T15:26:56.427 INFO:teuthology.orchestra.run.vm04.stdout:(40/138): libconfig-1.7.2-9.el9.x86_64.rpm 2.4 MB/s | 72 kB 00:00 2026-03-25T15:26:56.490 INFO:teuthology.orchestra.run.vm04.stdout:(41/138): libgfortran-11.5.0-14.el9.x86_64.rpm 12 MB/s | 794 kB 00:00 2026-03-25T15:26:56.521 INFO:teuthology.orchestra.run.vm04.stdout:(42/138): libquadmath-11.5.0-14.el9.x86_64.rpm 5.9 MB/s | 184 kB 00:00 2026-03-25T15:26:56.549 INFO:teuthology.orchestra.run.vm04.stdout:(43/138): mailcap-2.1.49-5.el9.noarch.rpm 1.1 MB/s | 33 kB 00:00 2026-03-25T15:26:56.579 INFO:teuthology.orchestra.run.vm04.stdout:(44/138): pciutils-3.7.0-7.el9.x86_64.rpm 3.0 MB/s | 93 kB 00:00 2026-03-25T15:26:56.612 INFO:teuthology.orchestra.run.vm04.stdout:(45/138): python3-cffi-1.14.5-5.el9.x86_64.rpm 7.7 MB/s | 253 kB 00:00 2026-03-25T15:26:56.747 INFO:teuthology.orchestra.run.vm04.stdout:(46/138): python3-cryptography-36.0.1-5.el9.x86 9.3 MB/s | 1.2 MB 00:00 2026-03-25T15:26:56.778 INFO:teuthology.orchestra.run.vm04.stdout:(47/138): python3-ply-3.11-14.el9.noarch.rpm 3.4 MB/s | 106 kB 00:00 2026-03-25T15:26:56.809 INFO:teuthology.orchestra.run.vm04.stdout:(48/138): python3-pycparser-2.20-6.el9.noarch.r 4.4 MB/s | 135 kB 00:00 2026-03-25T15:26:56.840 INFO:teuthology.orchestra.run.vm04.stdout:(49/138): 
python3-pyparsing-2.4.7-9.el9.noarch. 4.8 MB/s | 150 kB 00:00 2026-03-25T15:26:56.871 INFO:teuthology.orchestra.run.vm04.stdout:(50/138): python3-requests-2.25.1-10.el9.noarch 4.1 MB/s | 126 kB 00:00 2026-03-25T15:26:56.948 INFO:teuthology.orchestra.run.vm04.stdout:(51/138): python3-urllib3-1.26.5-7.el9.noarch.r 2.8 MB/s | 218 kB 00:00 2026-03-25T15:26:57.039 INFO:teuthology.orchestra.run.vm04.stdout:(52/138): unzip-6.0-59.el9.x86_64.rpm 2.0 MB/s | 182 kB 00:00 2026-03-25T15:26:57.156 INFO:teuthology.orchestra.run.vm04.stdout:(53/138): zip-3.0-35.el9.x86_64.rpm 2.2 MB/s | 266 kB 00:00 2026-03-25T15:26:57.498 INFO:teuthology.orchestra.run.vm04.stdout:(54/138): ceph-test-20.2.0-712.g70f8415b.el9.x8 9.6 MB/s | 84 MB 00:08 2026-03-25T15:26:57.500 INFO:teuthology.orchestra.run.vm04.stdout:(55/138): boost-program-options-1.75.0-13.el9.x 302 kB/s | 104 kB 00:00 2026-03-25T15:26:57.535 INFO:teuthology.orchestra.run.vm04.stdout:(56/138): ceph-mgr-dashboard-20.2.0-712.g70f841 3.3 MB/s | 11 MB 00:03 2026-03-25T15:26:57.592 INFO:teuthology.orchestra.run.vm04.stdout:(57/138): flexiblas-3.0.4-9.el9.x86_64.rpm 315 kB/s | 30 kB 00:00 2026-03-25T15:26:57.608 INFO:teuthology.orchestra.run.vm04.stdout:(58/138): flexiblas-netlib-3.0.4-9.el9.x86_64.r 28 MB/s | 3.0 MB 00:00 2026-03-25T15:26:57.620 INFO:teuthology.orchestra.run.vm04.stdout:(59/138): flexiblas-openblas-openmp-3.0.4-9.el9 177 kB/s | 15 kB 00:00 2026-03-25T15:26:57.639 INFO:teuthology.orchestra.run.vm04.stdout:(60/138): libpmemobj-1.12.1-1.el9.x86_64.rpm 5.0 MB/s | 160 kB 00:00 2026-03-25T15:26:57.676 INFO:teuthology.orchestra.run.vm04.stdout:(61/138): librdkafka-1.6.1-102.el9.x86_64.rpm 18 MB/s | 662 kB 00:00 2026-03-25T15:26:57.677 INFO:teuthology.orchestra.run.vm04.stdout:(62/138): librabbitmq-0.11.0-7.el9.x86_64.rpm 788 kB/s | 45 kB 00:00 2026-03-25T15:26:57.705 INFO:teuthology.orchestra.run.vm04.stdout:(63/138): libnbd-1.20.3-4.el9.x86_64.rpm 1.4 MB/s | 164 kB 00:00 2026-03-25T15:26:57.736 
INFO:teuthology.orchestra.run.vm04.stdout:(64/138): libstoragemgmt-1.10.1-1.el9.x86_64.rp 4.1 MB/s | 246 kB 00:00 2026-03-25T15:26:57.763 INFO:teuthology.orchestra.run.vm04.stdout:(65/138): lttng-ust-2.12.0-6.el9.x86_64.rpm 4.9 MB/s | 292 kB 00:00 2026-03-25T15:26:57.764 INFO:teuthology.orchestra.run.vm04.stdout:(66/138): libxslt-1.1.34-12.el9.x86_64.rpm 2.6 MB/s | 233 kB 00:00 2026-03-25T15:26:57.766 INFO:teuthology.orchestra.run.vm04.stdout:(67/138): lua-5.4.4-4.el9.x86_64.rpm 6.0 MB/s | 188 kB 00:00 2026-03-25T15:26:57.793 INFO:teuthology.orchestra.run.vm04.stdout:(68/138): openblas-0.3.29-1.el9.x86_64.rpm 1.4 MB/s | 42 kB 00:00 2026-03-25T15:26:57.797 INFO:teuthology.orchestra.run.vm04.stdout:(69/138): perl-Benchmark-1.23-483.el9.noarch.rp 878 kB/s | 26 kB 00:00 2026-03-25T15:26:57.827 INFO:teuthology.orchestra.run.vm04.stdout:(70/138): perl-Test-Harness-3.42-461.el9.noarch 8.5 MB/s | 295 kB 00:00 2026-03-25T15:26:57.951 INFO:teuthology.orchestra.run.vm04.stdout:(71/138): openblas-openmp-0.3.29-1.el9.x86_64.r 28 MB/s | 5.3 MB 00:00 2026-03-25T15:26:57.986 INFO:teuthology.orchestra.run.vm04.stdout:(72/138): python3-devel-3.9.25-3.el9.x86_64.rpm 7.0 MB/s | 244 kB 00:00 2026-03-25T15:26:58.019 INFO:teuthology.orchestra.run.vm04.stdout:(73/138): python3-jinja2-2.11.3-8.el9.noarch.rp 7.5 MB/s | 249 kB 00:00 2026-03-25T15:26:58.078 INFO:teuthology.orchestra.run.vm04.stdout:(74/138): python3-jmespath-1.0.1-1.el9.noarch.r 809 kB/s | 48 kB 00:00 2026-03-25T15:26:58.109 INFO:teuthology.orchestra.run.vm04.stdout:(75/138): python3-libstoragemgmt-1.10.1-1.el9.x 5.6 MB/s | 177 kB 00:00 2026-03-25T15:26:58.150 INFO:teuthology.orchestra.run.vm04.stdout:(76/138): python3-babel-2.9.1-2.el9.noarch.rpm 18 MB/s | 6.0 MB 00:00 2026-03-25T15:26:58.150 INFO:teuthology.orchestra.run.vm04.stdout:(77/138): python3-markupsafe-1.1.1-12.el9.x86_6 847 kB/s | 35 kB 00:00 2026-03-25T15:26:58.183 INFO:teuthology.orchestra.run.vm04.stdout:(78/138): protobuf-3.14.0-17.el9.x86_64.rpm 2.6 MB/s | 
1.0 MB 00:00 2026-03-25T15:26:58.203 INFO:teuthology.orchestra.run.vm04.stdout:(79/138): python3-numpy-f2py-1.23.5-2.el9.x86_6 8.3 MB/s | 442 kB 00:00 2026-03-25T15:26:58.230 INFO:teuthology.orchestra.run.vm04.stdout:(80/138): python3-packaging-20.9-5.el9.noarch.r 1.6 MB/s | 77 kB 00:00 2026-03-25T15:26:58.245 INFO:teuthology.orchestra.run.vm04.stdout:(81/138): python3-protobuf-3.14.0-17.el9.noarch 6.2 MB/s | 267 kB 00:00 2026-03-25T15:26:58.260 INFO:teuthology.orchestra.run.vm04.stdout:(82/138): python3-pyasn1-0.4.8-7.el9.noarch.rpm 5.0 MB/s | 157 kB 00:00 2026-03-25T15:26:58.277 INFO:teuthology.orchestra.run.vm04.stdout:(83/138): python3-pyasn1-modules-0.4.8-7.el9.no 8.5 MB/s | 277 kB 00:00 2026-03-25T15:26:58.290 INFO:teuthology.orchestra.run.vm04.stdout:(84/138): python3-requests-oauthlib-1.3.0-12.el 1.8 MB/s | 54 kB 00:00 2026-03-25T15:26:58.319 INFO:teuthology.orchestra.run.vm04.stdout:(85/138): python3-toml-0.10.2-6.el9.noarch.rpm 1.4 MB/s | 42 kB 00:00 2026-03-25T15:26:58.352 INFO:teuthology.orchestra.run.vm04.stdout:(86/138): qatlib-25.08.0-2.el9.x86_64.rpm 7.2 MB/s | 240 kB 00:00 2026-03-25T15:26:58.382 INFO:teuthology.orchestra.run.vm04.stdout:(87/138): qatlib-service-25.08.0-2.el9.x86_64.r 1.2 MB/s | 37 kB 00:00 2026-03-25T15:26:58.426 INFO:teuthology.orchestra.run.vm04.stdout:(88/138): python3-numpy-1.23.5-2.el9.x86_64.rpm 22 MB/s | 6.1 MB 00:00 2026-03-25T15:26:58.427 INFO:teuthology.orchestra.run.vm04.stdout:(89/138): qatzip-libs-1.3.1-1.el9.x86_64.rpm 1.5 MB/s | 66 kB 00:00 2026-03-25T15:26:58.459 INFO:teuthology.orchestra.run.vm04.stdout:(90/138): socat-1.7.4.1-8.el9.x86_64.rpm 9.0 MB/s | 303 kB 00:00 2026-03-25T15:26:58.460 INFO:teuthology.orchestra.run.vm04.stdout:(91/138): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.9 MB/s | 64 kB 00:00 2026-03-25T15:26:58.603 INFO:teuthology.orchestra.run.vm04.stdout:(92/138): lua-devel-5.4.4-4.el9.x86_64.rpm 156 kB/s | 22 kB 00:00 2026-03-25T15:26:59.055 INFO:teuthology.orchestra.run.vm04.stdout:(93/138): 
python3-scipy-1.9.3-2.el9.x86_64.rpm 25 MB/s | 19 MB 00:00 2026-03-25T15:26:59.088 INFO:teuthology.orchestra.run.vm04.stdout:(94/138): protobuf-compiler-3.14.0-17.el9.x86_6 1.3 MB/s | 862 kB 00:00 2026-03-25T15:26:59.283 INFO:teuthology.orchestra.run.vm04.stdout:(95/138): grpc-data-1.46.7-10.el9.noarch.rpm 100 kB/s | 19 kB 00:00 2026-03-25T15:26:59.338 INFO:teuthology.orchestra.run.vm04.stdout:(96/138): abseil-cpp-20211102.0-4.el9.x86_64.rp 750 kB/s | 551 kB 00:00 2026-03-25T15:26:59.341 INFO:teuthology.orchestra.run.vm04.stdout:(97/138): gperftools-libs-2.9.1-3.el9.x86_64.rp 1.1 MB/s | 308 kB 00:00 2026-03-25T15:26:59.409 INFO:teuthology.orchestra.run.vm04.stdout:(98/138): libarrow-doc-9.0.0-15.el9.noarch.rpm 352 kB/s | 25 kB 00:00 2026-03-25T15:26:59.411 INFO:teuthology.orchestra.run.vm04.stdout:(99/138): liboath-2.6.12-1.el9.x86_64.rpm 702 kB/s | 49 kB 00:00 2026-03-25T15:26:59.474 INFO:teuthology.orchestra.run.vm04.stdout:(100/138): libunwind-1.6.2-1.el9.x86_64.rpm 1.0 MB/s | 67 kB 00:00 2026-03-25T15:26:59.480 INFO:teuthology.orchestra.run.vm04.stdout:(101/138): luarocks-3.9.2-5.el9.noarch.rpm 2.1 MB/s | 151 kB 00:00 2026-03-25T15:26:59.516 INFO:teuthology.orchestra.run.vm04.stdout:(102/138): libarrow-9.0.0-15.el9.x86_64.rpm 19 MB/s | 4.4 MB 00:00 2026-03-25T15:26:59.546 INFO:teuthology.orchestra.run.vm04.stdout:(103/138): parquet-libs-9.0.0-15.el9.x86_64.rpm 11 MB/s | 838 kB 00:00 2026-03-25T15:26:59.548 INFO:teuthology.orchestra.run.vm04.stdout:(104/138): python3-asyncssh-2.13.2-5.el9.noarch 7.9 MB/s | 548 kB 00:00 2026-03-25T15:26:59.575 INFO:teuthology.orchestra.run.vm04.stdout:(105/138): python3-autocommand-2.2.2-8.el9.noar 500 kB/s | 29 kB 00:00 2026-03-25T15:26:59.606 INFO:teuthology.orchestra.run.vm04.stdout:(106/138): python3-backports-tarfile-1.2.0-1.el 1.0 MB/s | 60 kB 00:00 2026-03-25T15:26:59.608 INFO:teuthology.orchestra.run.vm04.stdout:(107/138): python3-bcrypt-3.2.2-1.el9.x86_64.rp 729 kB/s | 43 kB 00:00 2026-03-25T15:26:59.634 
INFO:teuthology.orchestra.run.vm04.stdout:(108/138): python3-cachetools-4.2.4-1.el9.noarc 543 kB/s | 32 kB 00:00 2026-03-25T15:26:59.666 INFO:teuthology.orchestra.run.vm04.stdout:(109/138): python3-certifi-2023.05.07-4.el9.noa 237 kB/s | 14 kB 00:00 2026-03-25T15:26:59.669 INFO:teuthology.orchestra.run.vm04.stdout:(110/138): python3-cheroot-10.0.1-4.el9.noarch. 2.8 MB/s | 173 kB 00:00 2026-03-25T15:26:59.698 INFO:teuthology.orchestra.run.vm04.stdout:(111/138): python3-cherrypy-18.6.1-2.el9.noarch 5.5 MB/s | 358 kB 00:00 2026-03-25T15:26:59.728 INFO:teuthology.orchestra.run.vm04.stdout:(112/138): python3-google-auth-2.45.0-1.el9.noa 4.0 MB/s | 254 kB 00:00 2026-03-25T15:26:59.753 INFO:teuthology.orchestra.run.vm04.stdout:(113/138): python3-grpcio-1.46.7-10.el9.x86_64. 24 MB/s | 2.0 MB 00:00 2026-03-25T15:26:59.759 INFO:teuthology.orchestra.run.vm04.stdout:(114/138): python3-grpcio-tools-1.46.7-10.el9.x 2.3 MB/s | 144 kB 00:00 2026-03-25T15:26:59.787 INFO:teuthology.orchestra.run.vm04.stdout:(115/138): python3-jaraco-8.2.1-3.el9.noarch.rp 182 kB/s | 11 kB 00:00 2026-03-25T15:26:59.813 INFO:teuthology.orchestra.run.vm04.stdout:(116/138): python3-jaraco-classes-3.2.1-5.el9.n 299 kB/s | 18 kB 00:00 2026-03-25T15:26:59.819 INFO:teuthology.orchestra.run.vm04.stdout:(117/138): python3-jaraco-collections-3.0.0-8.e 390 kB/s | 23 kB 00:00 2026-03-25T15:26:59.846 INFO:teuthology.orchestra.run.vm04.stdout:(118/138): python3-jaraco-context-6.0.1-3.el9.n 330 kB/s | 20 kB 00:00 2026-03-25T15:26:59.873 INFO:teuthology.orchestra.run.vm04.stdout:(119/138): python3-jaraco-functools-3.5.0-2.el9 325 kB/s | 19 kB 00:00 2026-03-25T15:26:59.879 INFO:teuthology.orchestra.run.vm04.stdout:(120/138): python3-jaraco-text-4.0.0-2.el9.noar 443 kB/s | 26 kB 00:00 2026-03-25T15:26:59.921 INFO:teuthology.orchestra.run.vm04.stdout:(121/138): python3-kubernetes-26.1.0-3.el9.noar 14 MB/s | 1.0 MB 00:00 2026-03-25T15:26:59.933 INFO:teuthology.orchestra.run.vm04.stdout:(122/138): 
python3-more-itertools-8.12.0-2.el9. 1.3 MB/s | 79 kB 00:00 2026-03-25T15:26:59.940 INFO:teuthology.orchestra.run.vm04.stdout:(123/138): python3-natsort-7.1.1-5.el9.noarch.r 956 kB/s | 58 kB 00:00 2026-03-25T15:26:59.981 INFO:teuthology.orchestra.run.vm04.stdout:(124/138): python3-portend-3.1.0-2.el9.noarch.r 275 kB/s | 16 kB 00:00 2026-03-25T15:26:59.994 INFO:teuthology.orchestra.run.vm04.stdout:(125/138): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.5 MB/s | 90 kB 00:00 2026-03-25T15:26:59.999 INFO:teuthology.orchestra.run.vm04.stdout:(126/138): python3-repoze-lru-0.7-16.el9.noarch 521 kB/s | 31 kB 00:00 2026-03-25T15:27:00.043 INFO:teuthology.orchestra.run.vm04.stdout:(127/138): python3-routes-2.5.1-5.el9.noarch.rp 3.0 MB/s | 188 kB 00:00 2026-03-25T15:27:00.054 INFO:teuthology.orchestra.run.vm04.stdout:(128/138): python3-rsa-4.9-2.el9.noarch.rpm 993 kB/s | 59 kB 00:00 2026-03-25T15:27:00.058 INFO:teuthology.orchestra.run.vm04.stdout:(129/138): python3-tempora-5.0.0-2.el9.noarch.r 601 kB/s | 36 kB 00:00 2026-03-25T15:27:00.104 INFO:teuthology.orchestra.run.vm04.stdout:(130/138): python3-typing-extensions-4.15.0-1.e 1.4 MB/s | 86 kB 00:00 2026-03-25T15:27:00.115 INFO:teuthology.orchestra.run.vm04.stdout:(131/138): python3-websocket-client-1.2.3-2.el9 1.4 MB/s | 90 kB 00:00 2026-03-25T15:27:00.119 INFO:teuthology.orchestra.run.vm04.stdout:(132/138): python3-xmltodict-0.12.0-15.el9.noar 370 kB/s | 22 kB 00:00 2026-03-25T15:27:00.164 INFO:teuthology.orchestra.run.vm04.stdout:(133/138): python3-zc-lockfile-2.0-10.el9.noarc 337 kB/s | 20 kB 00:00 2026-03-25T15:27:00.179 INFO:teuthology.orchestra.run.vm04.stdout:(134/138): re2-20211101-20.el9.x86_64.rpm 2.9 MB/s | 191 kB 00:00 2026-03-25T15:27:00.181 INFO:teuthology.orchestra.run.vm04.stdout:(135/138): s3cmd-2.4.0-1.el9.noarch.rpm 3.2 MB/s | 206 kB 00:00 2026-03-25T15:27:00.244 INFO:teuthology.orchestra.run.vm04.stdout:(136/138): thrift-0.15.0-4.el9.x86_64.rpm 20 MB/s | 1.6 MB 00:00 2026-03-25T15:27:01.483 
INFO:teuthology.orchestra.run.vm04.stdout:(137/138): librbd1-20.2.0-712.g70f8415b.el9.x86 2.2 MB/s | 2.8 MB 00:01 2026-03-25T15:27:02.065 INFO:teuthology.orchestra.run.vm04.stdout:(138/138): librados2-20.2.0-712.g70f8415b.el9.x 1.9 MB/s | 3.5 MB 00:01 2026-03-25T15:27:02.068 INFO:teuthology.orchestra.run.vm04.stdout:-------------------------------------------------------------------------------- 2026-03-25T15:27:02.073 INFO:teuthology.orchestra.run.vm04.stdout:Total 13 MB/s | 267 MB 00:19 2026-03-25T15:27:02.868 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-25T15:27:02.927 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-25T15:27:02.927 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-25T15:27:03.962 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-25T15:27:03.962 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-25T15:27:05.626 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-25T15:27:05.636 INFO:teuthology.orchestra.run.vm04.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/140 2026-03-25T15:27:05.640 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/140 2026-03-25T15:27:05.655 INFO:teuthology.orchestra.run.vm04.stdout: Installing : liboath-2.6.12-1.el9.x86_64 3/140 2026-03-25T15:27:05.869 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 4/140 2026-03-25T15:27:05.871 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librados2-2:20.2.0-712.g70f8415b.el9.x86_64 5/140 2026-03-25T15:27:05.909 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:20.2.0-712.g70f8415b.el9.x86_64 5/140 2026-03-25T15:27:05.920 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rados-2:20.2.0-712.g70f8415b.el9.x86_64 6/140 2026-03-25T15:27:05.924 INFO:teuthology.orchestra.run.vm04.stdout: Installing : 
librdkafka-1.6.1-102.el9.x86_64 7/140 2026-03-25T15:27:05.929 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/140 2026-03-25T15:27:05.932 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 9/140 2026-03-25T15:27:05.938 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 10/140 2026-03-25T15:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libnbd-1.20.3-4.el9.x86_64 11/140 2026-03-25T15:27:06.193 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librbd1-2:20.2.0-712.g70f8415b.el9.x86_64 12/140 2026-03-25T15:27:06.216 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:20.2.0-712.g70f8415b.el9.x86_64 12/140 2026-03-25T15:27:06.217 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64 13/140 2026-03-25T15:27:06.246 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64 13/140 2026-03-25T15:27:06.247 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_ 14/140 2026-03-25T15:27:06.272 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_ 14/140 2026-03-25T15:27:06.320 INFO:teuthology.orchestra.run.vm04.stdout: Installing : re2-1:20211101-20.el9.x86_64 15/140 2026-03-25T15:27:06.348 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 16/140 2026-03-25T15:27:06.360 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 17/140 2026-03-25T15:27:06.368 INFO:teuthology.orchestra.run.vm04.stdout: Installing : protobuf-3.14.0-17.el9.x86_64 18/140 2026-03-25T15:27:06.371 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lua-5.4.4-4.el9.x86_64 19/140 2026-03-25T15:27:06.377 INFO:teuthology.orchestra.run.vm04.stdout: Installing : 
flexiblas-3.0.4-9.el9.x86_64 20/140 2026-03-25T15:27:06.406 INFO:teuthology.orchestra.run.vm04.stdout: Installing : unzip-6.0-59.el9.x86_64 21/140 2026-03-25T15:27:06.424 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 22/140 2026-03-25T15:27:06.429 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 23/140 2026-03-25T15:27:06.438 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 24/140 2026-03-25T15:27:06.441 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 25/140 2026-03-25T15:27:06.487 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 26/140 2026-03-25T15:27:06.695 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-common-2:20.2.0-712.g70f8415b.el9.x 27/140 2026-03-25T15:27:06.712 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-argparse-2:20.2.0-712.g70f8415b.el9 28/140 2026-03-25T15:27:06.713 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_ 29/140 2026-03-25T15:27:06.770 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_ 29/140 2026-03-25T15:27:06.772 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64 30/140 2026-03-25T15:27:06.795 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64 30/140 2026-03-25T15:27:06.812 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cephfs-2:20.2.0-712.g70f8415b.el9.x86_64 31/140 2026-03-25T15:27:06.822 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 32/140 2026-03-25T15:27:06.859 INFO:teuthology.orchestra.run.vm04.stdout: Installing : zip-3.0-35.el9.x86_64 33/140 2026-03-25T15:27:06.864 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : luarocks-3.9.2-5.el9.noarch 34/140 2026-03-25T15:27:06.874 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lua-devel-5.4.4-4.el9.x86_64 35/140 2026-03-25T15:27:06.935 INFO:teuthology.orchestra.run.vm04.stdout: Installing : protobuf-compiler-3.14.0-17.el9.x86_64 36/140 2026-03-25T15:27:06.951 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 37/140 2026-03-25T15:27:06.970 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rsa-4.9-2.el9.noarch 38/140 2026-03-25T15:27:06.976 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rbd-2:20.2.0-712.g70f8415b.el9.x86_64 39/140 2026-03-25T15:27:06.986 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 40/140 2026-03-25T15:27:06.992 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64 41/140 2026-03-25T15:27:06.997 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 42/140 2026-03-25T15:27:07.016 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 43/140 2026-03-25T15:27:07.024 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 44/140 2026-03-25T15:27:07.033 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 45/140 2026-03-25T15:27:07.048 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 46/140 2026-03-25T15:27:07.061 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 47/140 2026-03-25T15:27:07.070 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 48/140 2026-03-25T15:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 49/140 
2026-03-25T15:27:07.135 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 50/140 2026-03-25T15:27:07.541 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 51/140 2026-03-25T15:27:07.563 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 52/140 2026-03-25T15:27:07.568 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 53/140 2026-03-25T15:27:07.575 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 54/140 2026-03-25T15:27:07.580 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 55/140 2026-03-25T15:27:07.587 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 56/140 2026-03-25T15:27:07.590 INFO:teuthology.orchestra.run.vm04.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 57/140 2026-03-25T15:27:07.592 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 58/140 2026-03-25T15:27:07.626 INFO:teuthology.orchestra.run.vm04.stdout: Installing : grpc-data-1.46.7-10.el9.noarch 59/140 2026-03-25T15:27:07.690 INFO:teuthology.orchestra.run.vm04.stdout: Installing : abseil-cpp-20211102.0-4.el9.x86_64 60/140 2026-03-25T15:27:07.702 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-grpcio-1.46.7-10.el9.x86_64 61/140 2026-03-25T15:27:07.712 INFO:teuthology.orchestra.run.vm04.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 62/140 2026-03-25T15:27:07.721 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 63/140 2026-03-25T15:27:07.728 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 64/140 2026-03-25T15:27:07.734 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 65/140 
2026-03-25T15:27:07.743 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 66/140 2026-03-25T15:27:07.748 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 67/140 2026-03-25T15:27:07.790 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 68/140 2026-03-25T15:27:07.805 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-protobuf-3.14.0-17.el9.noarch 69/140 2026-03-25T15:27:07.813 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-grpcio-tools-1.46.7-10.el9.x86_64 70/140 2026-03-25T15:27:07.828 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 71/140 2026-03-25T15:27:07.876 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 72/140 2026-03-25T15:27:08.193 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 73/140 2026-03-25T15:27:08.230 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 74/140 2026-03-25T15:27:08.235 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 75/140 2026-03-25T15:27:08.240 INFO:teuthology.orchestra.run.vm04.stdout: Installing : perl-Benchmark-1.23-483.el9.noarch 76/140 2026-03-25T15:27:08.319 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-0.3.29-1.el9.x86_64 77/140 2026-03-25T15:27:08.341 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 78/140 2026-03-25T15:27:08.371 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 79/140 2026-03-25T15:27:08.850 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 80/140 2026-03-25T15:27:08.955 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 81/140 
2026-03-25T15:27:09.831 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 82/140 2026-03-25T15:27:09.859 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 83/140 2026-03-25T15:27:09.865 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 84/140 2026-03-25T15:27:09.869 INFO:teuthology.orchestra.run.vm04.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 85/140 2026-03-25T15:27:09.876 INFO:teuthology.orchestra.run.vm04.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 86/140 2026-03-25T15:27:10.198 INFO:teuthology.orchestra.run.vm04.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 87/140 2026-03-25T15:27:10.468 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librgw2-2:20.2.0-712.g70f8415b.el9.x86_64 88/140 2026-03-25T15:27:10.665 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:20.2.0-712.g70f8415b.el9.x86_64 88/140 2026-03-25T15:27:10.690 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rgw-2:20.2.0-712.g70f8415b.el9.x86_64 89/140 2026-03-25T15:27:12.134 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 90/140 2026-03-25T15:27:12.240 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 90/140 2026-03-25T15:27:12.268 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 90/140 2026-03-25T15:27:12.285 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyparsing-2.4.7-9.el9.noarch 91/140 2026-03-25T15:27:12.382 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-packaging-20.9-5.el9.noarch 92/140 2026-03-25T15:27:12.543 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ply-3.11-14.el9.noarch 93/140 2026-03-25T15:27:12.747 INFO:teuthology.orchestra.run.vm04.stdout: Installing : 
python3-pycparser-2.20-6.el9.noarch 94/140 2026-03-25T15:27:13.244 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 95/140 2026-03-25T15:27:13.313 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 96/140 2026-03-25T15:27:13.544 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 97/140 2026-03-25T15:27:13.685 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 98/140 2026-03-25T15:27:13.842 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 99/140 2026-03-25T15:27:13.862 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 100/140 2026-03-25T15:27:13.873 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 101/140 2026-03-25T15:27:13.883 INFO:teuthology.orchestra.run.vm04.stdout: Installing : pciutils-3.7.0-7.el9.x86_64 102/140 2026-03-25T15:27:13.895 INFO:teuthology.orchestra.run.vm04.stdout: Installing : qatlib-25.08.0-2.el9.x86_64 103/140 2026-03-25T15:27:13.897 INFO:teuthology.orchestra.run.vm04.stdout: Installing : qatlib-service-25.08.0-2.el9.x86_64 104/140 2026-03-25T15:27:13.923 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 104/140 2026-03-25T15:27:14.311 INFO:teuthology.orchestra.run.vm04.stdout: Installing : qatzip-libs-1.3.1-1.el9.x86_64 105/140 2026-03-25T15:27:14.352 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 106/140 2026-03-25T15:27:14.405 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 106/140 2026-03-25T15:27:14.405 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 
2026-03-25T15:27:14.405 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 2026-03-25T15:27:14.405 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:14.411 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64 107/140 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64 107/140 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp 2026-03-25T15:27:21.640 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:21.767 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 108/140 2026-03-25T15:27:21.794 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 108/140 2026-03-25T15:27:21.794 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:21.794 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-25T15:27:21.794 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-25T15:27:21.794 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-25T15:27:21.794 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:22.071 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 109/140 2026-03-25T15:27:22.099 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 109/140 2026-03-25T15:27:22.099 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:22.099 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-25T15:27:22.099 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-25T15:27:22.099 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-25T15:27:22.099 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:22.108 INFO:teuthology.orchestra.run.vm04.stdout: Installing : mailcap-2.1.49-5.el9.noarch 110/140 2026-03-25T15:27:22.111 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 111/140 2026-03-25T15:27:22.134 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 112/140 2026-03-25T15:27:22.134 INFO:teuthology.orchestra.run.vm04.stdout:Creating group 'qat' with GID 994. 2026-03-25T15:27:22.134 INFO:teuthology.orchestra.run.vm04.stdout:Creating group 'libstoragemgmt' with GID 993. 2026-03-25T15:27:22.134 INFO:teuthology.orchestra.run.vm04.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 993 and GID 993. 
2026-03-25T15:27:22.134 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:22.169 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 112/140 2026-03-25T15:27:22.201 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 112/140 2026-03-25T15:27:22.201 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-25T15:27:22.201 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:22.231 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 113/140 2026-03-25T15:27:22.266 INFO:teuthology.orchestra.run.vm04.stdout: Installing : fuse-2.9.9-17.el9.x86_64 114/140 2026-03-25T15:27:22.353 INFO:teuthology.orchestra.run.vm04.stdout: Installing : cryptsetup-2.8.1-3.el9.x86_64 115/140 2026-03-25T15:27:22.378 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 116/140 2026-03-25T15:27:22.394 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 116/140 2026-03-25T15:27:22.394 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:22.394 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 2026-03-25T15:27:22.394 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:23.265 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 117/140 2026-03-25T15:27:23.295 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 117/140 2026-03-25T15:27:23.295 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-25T15:27:23.295 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-25T15:27:23.295 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-25T15:27:23.295 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-25T15:27:23.295 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:23.382 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:20.2.0-712.g70f8415b.el9.noarch 118/140 2026-03-25T15:27:23.385 INFO:teuthology.orchestra.run.vm04.stdout: Installing : cephadm-2:20.2.0-712.g70f8415b.el9.noarch 118/140 2026-03-25T15:27:23.397 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-prometheus-alerts-2:20.2.0-712.g70f8415b.el 119/140 2026-03-25T15:27:23.431 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-grafana-dashboards-2:20.2.0-712.g70f8415b.e 120/140 2026-03-25T15:27:23.434 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noar 121/140 2026-03-25T15:27:24.879 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noar 121/140 2026-03-25T15:27:25.213 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.no 122/140 2026-03-25T15:27:25.928 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.no 122/140 2026-03-25T15:27:25.931 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-diskprediction-local-2:20.2.0-712.g70f8 123/140 2026-03-25T15:27:26.022 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:20.2.0-712.g70f8 123/140 2026-03-25T15:27:26.125 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-modules-core-2:20.2.0-712.g70f8415b.el9 124/140 2026-03-25T15:27:26.130 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 125/140 2026-03-25T15:27:26.160 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 125/140 2026-03-25T15:27:26.161 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:26.161 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-25T15:27:26.161 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-25T15:27:26.161 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 
2026-03-25T15:27:26.161 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:26.203 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch 126/140 2026-03-25T15:27:26.217 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch 126/140 2026-03-25T15:27:26.282 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-2:20.2.0-712.g70f8415b.el9.x86_64 127/140 2026-03-25T15:27:27.822 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 128/140 2026-03-25T15:27:27.827 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 129/140 2026-03-25T15:27:27.855 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 129/140 2026-03-25T15:27:27.855 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:27.855 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-25T15:27:27.855 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-25T15:27:27.855 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 
2026-03-25T15:27:27.855 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:27.869 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-immutable-object-cache-2:20.2.0-712.g70f841 130/140 2026-03-25T15:27:27.894 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:20.2.0-712.g70f841 130/140 2026-03-25T15:27:27.894 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:27.894 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-25T15:27:27.894 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:28.077 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 131/140 2026-03-25T15:27:28.102 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 131/140 2026-03-25T15:27:28.102 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:27:28.102 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-25T15:27:28.102 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-25T15:27:28.102 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-25T15:27:28.102 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:33.728 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-test-2:20.2.0-712.g70f8415b.el9.x86_64 132/140 2026-03-25T15:27:33.803 INFO:teuthology.orchestra.run.vm04.stdout: Installing : perl-Test-Harness-1:3.42-461.el9.noarch 133/140 2026-03-25T15:27:33.816 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs-devel-2:20.2.0-712.g70f8415b.el9.x86_6 134/140 2026-03-25T15:27:33.846 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 135/140 2026-03-25T15:27:33.926 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-nbd-2:20.2.0-712.g70f8415b.el9.x86_64 136/140 2026-03-25T15:27:33.972 INFO:teuthology.orchestra.run.vm04.stdout: Installing : s3cmd-2.4.0-1.el9.noarch 137/140 2026-03-25T15:27:34.027 INFO:teuthology.orchestra.run.vm04.stdout: Installing : bzip2-1.0.8-11.el9.x86_64 138/140 2026-03-25T15:27:34.027 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 139/140 2026-03-25T15:27:34.080 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 139/140 2026-03-25T15:27:34.080 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 140/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 140/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:20.2.0-712.g70f8415b.el9.x86_64 1/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 2/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 3/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 4/140 2026-03-25T15:27:36.967 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:20.2.0-712.g70f841 5/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 6/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 7/140 2026-03-25T15:27:36.967 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 8/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 9/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 10/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64 11/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:20.2.0-712.g70f8415b.el9.x86_64 12/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:20.2.0-712.g70f8415b.el9.x86_6 13/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_ 14/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64 15/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64 16/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64 17/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_ 18/140 2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:20.2.0-712.g70f8415b.el9.x86_64 19/140 2026-03-25T15:27:36.968 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:20.2.0-712.g70f8415b.el9 20/140
2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:20.2.0-712.g70f8415b.el9.x 21/140
2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:20.2.0-712.g70f8415b.el9.x86_64 22/140
2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:20.2.0-712.g70f8415b.el9.x86_64 23/140
2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:20.2.0-712.g70f8415b.el9.x86_64 24/140
2026-03-25T15:27:36.968 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:20.2.0-712.g70f8415b.el9.x86_64 25/140
2026-03-25T15:27:36.969 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 26/140
2026-03-25T15:27:36.969 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 27/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:20.2.0-712.g70f8415b.el9.x86_64 28/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:20.2.0-712.g70f8415b.e 29/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noar 30/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.no 31/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:20.2.0-712.g70f8 32/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:20.2.0-712.g70f8415b.el9 33/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch 34/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-prometheus-alerts-2:20.2.0-712.g70f8415b.el 35/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 36/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:20.2.0-712.g70f8415b.el9.noarch 37/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : bzip2-1.0.8-11.el9.x86_64 38/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 39/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : fuse-2.9.9-17.el9.x86_64 40/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 41/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 42/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 43/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 44/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 45/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 46/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 47/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 48/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 49/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 50/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 51/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 52/140
2026-03-25T15:27:36.970 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 53/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : unzip-6.0-59.el9.x86_64 54/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : zip-3.0-35.el9.x86_64 55/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 56/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 57/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 58/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 59/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 60/140
2026-03-25T15:27:36.971 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 61/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 62/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 63/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 64/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 65/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 66/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-5.4.4-4.el9.x86_64 67/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 68/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 69/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : perl-Benchmark-1.23-483.el9.noarch 70/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : perl-Test-Harness-1:3.42-461.el9.noarch 71/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 72/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 73/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 74/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 75/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 76/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 77/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 78/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 79/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 80/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 81/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 82/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 83/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 84/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 85/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 86/140
2026-03-25T15:27:36.972 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 87/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 88/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 89/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 90/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 91/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 92/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 93/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 94/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 95/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 96/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 97/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 98/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 99/140
2026-03-25T15:27:36.973 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 100/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 101/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 102/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 103/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 104/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 105/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 106/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 107/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 108/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 109/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 110/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 111/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 112/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 113/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 114/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 115/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 116/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 117/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 118/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 119/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 120/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 121/140
2026-03-25T15:27:36.974 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 122/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 123/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 124/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 125/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 126/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 127/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 128/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 129/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 130/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 131/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 132/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 133/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 134/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : s3cmd-2.4.0-1.el9.noarch 135/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 136/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:20.2.0-712.g70f8415b.el9.x86_64 137/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 138/140
2026-03-25T15:27:36.975 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:20.2.0-712.g70f8415b.el9.x86_64 139/140
2026-03-25T15:27:37.558 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 140/140
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout:Upgraded:
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: librbd1-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout:Installed:
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp-20211102.0-4.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: bzip2-1.0.8-11.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup-2.8.1-3.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: fuse-2.9.9-17.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data-1.46.7-10.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libnbd-1.20.3-4.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.559 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: lua-5.4.4-4.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel-5.4.4-4.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: luarocks-3.9.2-5.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: pciutils-3.7.0-7.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: perl-Benchmark-1.23-483.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: perl-Test-Harness-1:3.42-461.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-3.14.0-17.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler-3.14.0-17.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-1.46.7-10.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging-20.9-5.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf-3.14.0-17.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing-2.4.7-9.el9.noarch
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.560 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-25.08.0-2.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service-25.08.0-2.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs-1.3.1-1.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: s3cmd-2.4.0-1.el9.noarch
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: unzip-6.0-59.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout: zip-3.0-35.el9.x86_64
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:27:37.561 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:27:37.720 DEBUG:teuthology.parallel:result is None
2026-03-25T15:27:37.720 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-25T15:27:38.355 DEBUG:teuthology.orchestra.run.vm04:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-25T15:27:38.377 INFO:teuthology.orchestra.run.vm04.stdout:20.2.0-712.g70f8415b.el9
2026-03-25T15:27:38.378 INFO:teuthology.packaging:The installed version of ceph is 20.2.0-712.g70f8415b.el9
2026-03-25T15:27:38.378 INFO:teuthology.task.install:The correct ceph version 20.2.0-712.g70f8415b is installed.
2026-03-25T15:27:38.379 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-25T15:27:38.379 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:38.379 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-25T15:27:38.448 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-25T15:27:38.448 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:38.448 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/daemon-helper
2026-03-25T15:27:38.517 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-25T15:27:38.583 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-25T15:27:38.583 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:38.583 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-25T15:27:38.649 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-25T15:27:38.712 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-25T15:27:38.712 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:38.712 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/stdin-killer
2026-03-25T15:27:38.777 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-25T15:27:38.843 INFO:teuthology.run_tasks:Running task ceph...
2026-03-25T15:27:38.887 INFO:tasks.ceph:Making ceph log dir writeable by non-root...
2026-03-25T15:27:38.887 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 777 /var/log/ceph
2026-03-25T15:27:38.910 INFO:tasks.ceph:Disabling ceph logrotate...
2026-03-25T15:27:38.910 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/logrotate.d/ceph
2026-03-25T15:27:38.976 INFO:tasks.ceph:Creating extra log directories...
2026-03-25T15:27:38.976 DEBUG:teuthology.orchestra.run.vm04:> sudo install -d -m0777 -- /var/log/ceph/valgrind /var/log/ceph/profiling-logger
2026-03-25T15:27:39.051 INFO:tasks.ceph:Creating ceph cluster ceph...
2026-03-25T15:27:39.051 INFO:tasks.ceph:config {'conf': {'client': {'rbd default format': 1}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluefs allocator': 'hybrid', 'bluestore allocator': 'hybrid', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'fs': 'xfs', 'mkfs_options': None, 'mount_options': None, 'skip_mgr_daemons': False, 'log_ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'cpu_profile': set(), 'cluster': 'ceph', 'mon_bind_msgr2': True, 'mon_bind_addrvec': True}
2026-03-25T15:27:39.051 INFO:tasks.ceph:ctx.config {'archive_path': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645', 'branch': 'tentacle', 'description': 'rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-hybrid supported-random-distro$/{centos_latest} workloads/rbd_cli_generic}', 'email': None, 'first_in_suite': False, 'flavor': 'default', 'job_id': '3645', 'last_in_suite': False, 'machine_type': 'vps', 'name': 'kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps', 'no_nested_subset': False, 'os_type': 'centos', 'os_version': '9.stream', 'overrides': {'admin_socket': {'branch': 'tentacle'}, 'ansible.cephlab': {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}, 'ceph': {'conf': {'client': {'rbd default format': 1}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluefs allocator': 'hybrid', 'bluestore allocator': 'hybrid', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'ceph-deploy': {'bluestore': True, 'conf': {'client': {'log file': '/var/log/ceph/ceph-$name.$pid.log'}, 'global': {'osd crush chooseleaf type': 0, 'osd pool default pg num': 128, 'osd pool default pgp num': 128, 'osd pool default size': 2}, 'mon': {}, 'osd': {'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd objectstore': 'bluestore'}}, 'fs': 'xfs'}, 'cephadm': {'cephadm_binary_url': 'https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm'}, 'install': {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}, 'thrashosds': {'bdev_inject_crash': 2, 'bdev_inject_crash_probability': 0.5}, 'workunit': {'branch': 'tt-tentacle', 'sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4'}}, 'owner': 'kyr', 'priority': 1000, 'repo': 'https://github.com/ceph/ceph.git', 'roles': [['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']], 'seed': 3051, 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'sleep_before_teardown': 0, 'subset': '1/128', 'suite': 'rbd', 'suite_branch': 'tt-tentacle', 'suite_path': '/home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa', 'suite_relpath': 'qa', 'suite_repo': 'https://github.com/kshtsk/ceph.git', 'suite_sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4', 'targets': {'vm04.local': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMVY6yyFd+yKb0Czj+cE3uCZuJasW7ZXLrRixIHZmdx8en0TD8LaabV085w2jTe0EUcfOMcPCxMAA53W+PlTLSM='}, 'tasks': [{'internal.check_packages': None}, {'internal.buildpackages_prep': None}, {'internal.save_config': None}, {'internal.check_lock': None}, {'internal.add_remotes': None}, {'console_log': None}, {'internal.connect': None}, {'internal.push_inventory': None}, {'internal.serialize_remote_roles': None}, {'internal.check_conflict': None}, {'internal.check_ceph_data': None}, {'internal.vm_setup': None}, {'internal.base': None}, {'internal.archive_upload': None}, {'internal.archive': None}, {'internal.coredump': None}, {'internal.sudo': None}, {'internal.syslog': None}, {'internal.timer': None}, {'pcp': None}, {'selinux': None}, {'ansible.cephlab': None}, {'clock': None}, {'install': None}, {'ceph': None}, {'workunit': {'clients': {'client.0': ['rbd/cli_generic.sh']}}}], 'teuthology': {'fragments_dropped': [], 'meta': {}, 'postmerge': []}, 'teuthology_branch': 'clyso-debian-13', 'teuthology_repo': 'https://github.com/clyso/teuthology', 'teuthology_sha1': '1c580df7a9c7c2aadc272da296344fd99f27c444', 'timestamp': '2026-03-20_22:04:26', 'tube': 'vps', 'user': 'kyr', 'verbose': False, 'worker_log': '/home/teuthos/.teuthology/dispatcher/dispatcher.vps.3013333'}
2026-03-25T15:27:39.051 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/ceph.data
2026-03-25T15:27:39.108 DEBUG:teuthology.orchestra.run.vm04:> sudo install -d -m0777 -- /var/run/ceph
2026-03-25T15:27:39.174 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:39.174 DEBUG:teuthology.orchestra.run.vm04:> dd if=/scratch_devs of=/dev/stdout
2026-03-25T15:27:39.232 DEBUG:teuthology.misc:devs=['/dev/vg_nvme/lv_1', '/dev/vg_nvme/lv_2', '/dev/vg_nvme/lv_3', '/dev/vg_nvme/lv_4']
2026-03-25T15:27:39.232 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_1
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_1 -> ../dm-0
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 638 Links: 1
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:device_t:s0
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-25 15:27:34.489617053 +0000
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-25 15:26:06.109190508 +0000
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-25 15:26:06.109190508 +0000
2026-03-25T15:27:39.299 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-25 15:26:06.109190508 +0000
2026-03-25T15:27:39.300 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_1 of=/dev/null count=1
2026-03-25T15:27:39.375 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-25T15:27:39.375 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-25T15:27:39.375 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000213529 s, 2.4 MB/s
2026-03-25T15:27:39.376 DEBUG:teuthology.orchestra.run.vm04:> !
mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_1 2026-03-25T15:27:39.448 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_2 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_2 -> ../dm-1 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 700 Links: 1 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:device_t:s0 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-25 15:27:34.489617053 +0000 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-25 15:26:06.328191049 +0000 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-25 15:26:06.328191049 +0000 2026-03-25T15:27:39.514 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-25 15:26:06.328191049 +0000 2026-03-25T15:27:39.515 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_2 of=/dev/null count=1 2026-03-25T15:27:39.581 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-25T15:27:39.582 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-25T15:27:39.582 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000171011 s, 3.0 MB/s 2026-03-25T15:27:39.583 DEBUG:teuthology.orchestra.run.vm04:> ! 
mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_2 2026-03-25T15:27:39.639 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_3 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_3 -> ../dm-2 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 737 Links: 1 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:device_t:s0 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-25 15:27:34.490617053 +0000 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-25 15:26:06.601191722 +0000 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-25 15:26:06.601191722 +0000 2026-03-25T15:27:39.697 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-25 15:26:06.601191722 +0000 2026-03-25T15:27:39.698 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_3 of=/dev/null count=1 2026-03-25T15:27:39.761 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-25T15:27:39.761 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-25T15:27:39.762 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000159259 s, 3.2 MB/s 2026-03-25T15:27:39.763 DEBUG:teuthology.orchestra.run.vm04:> ! 
mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_3 2026-03-25T15:27:39.819 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vg_nvme/lv_4 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vg_nvme/lv_4 -> ../dm-3 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 753 Links: 1 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:device_t:s0 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-25 15:27:34.490617053 +0000 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-25 15:26:06.859192359 +0000 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-25 15:26:06.859192359 +0000 2026-03-25T15:27:39.881 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-25 15:26:06.859192359 +0000 2026-03-25T15:27:39.881 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vg_nvme/lv_4 of=/dev/null count=1 2026-03-25T15:27:39.948 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-25T15:27:39.948 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-25T15:27:39.948 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000232915 s, 2.2 MB/s 2026-03-25T15:27:39.949 DEBUG:teuthology.orchestra.run.vm04:> ! 
mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_4 2026-03-25T15:27:40.010 INFO:tasks.ceph:osd dev map: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-25T15:27:40.010 INFO:tasks.ceph:remote_to_roles_to_devs: {Remote(name='ubuntu@vm04.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}} 2026-03-25T15:27:40.010 INFO:tasks.ceph:Generating config... 2026-03-25T15:27:40.011 INFO:tasks.ceph:[client] rbd default format = 1 2026-03-25T15:27:40.011 INFO:tasks.ceph:[global] mon client directed command retry = 5 2026-03-25T15:27:40.011 INFO:tasks.ceph:[global] mon warn on pool no app = False 2026-03-25T15:27:40.011 INFO:tasks.ceph:[global] ms inject socket failures = 5000 2026-03-25T15:27:40.011 INFO:tasks.ceph:[mgr] debug mgr = 20 2026-03-25T15:27:40.011 INFO:tasks.ceph:[mgr] debug ms = 1 2026-03-25T15:27:40.011 INFO:tasks.ceph:[mon] debug mon = 20 2026-03-25T15:27:40.011 INFO:tasks.ceph:[mon] debug ms = 1 2026-03-25T15:27:40.011 INFO:tasks.ceph:[mon] debug paxos = 20 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] bluefs allocator = hybrid 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] bluestore allocator = hybrid 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] bluestore block size = 96636764160 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] bluestore fsck on mount = True 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] debug bluefs = 1/20 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] debug bluestore = 1/20 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] debug ms = 1 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] debug osd = 20 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] debug rocksdb = 4/10 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] mon osd backfillfull_ratio = 0.85 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] mon osd full ratio = 0.9 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] mon osd nearfull ratio = 0.8 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] osd failsafe full ratio = 
0.95 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] osd mclock iops capacity threshold hdd = 49000 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] osd objectstore = bluestore 2026-03-25T15:27:40.011 INFO:tasks.ceph:[osd] osd shutdown pgref assert = True 2026-03-25T15:27:40.011 INFO:tasks.ceph:Setting up mon.a... 2026-03-25T15:27:40.012 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring /etc/ceph/ceph.keyring 2026-03-25T15:27:40.089 INFO:teuthology.orchestra.run.vm04.stdout:creating /etc/ceph/ceph.keyring 2026-03-25T15:27:40.092 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=mon. /etc/ceph/ceph.keyring 2026-03-25T15:27:40.180 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0644 /etc/ceph/ceph.keyring 2026-03-25T15:27:40.214 DEBUG:tasks.ceph:Ceph mon addresses: [('mon.a', '192.168.123.104')] 2026-03-25T15:27:40.214 DEBUG:tasks.ceph:writing out conf {'global': {'chdir': '', 'pid file': '/var/run/ceph/$cluster-$name.pid', 'auth supported': 'cephx', 'filestore xattr use omap': 'true', 'mon clock drift allowed': '1.000', 'osd crush chooseleaf type': '0', 'auth debug': 'true', 'ms die on old message': 'true', 'ms die on bug': 'true', 'mon max pg per osd': '10000', 'mon pg warn max object skew': '0', 'osd_pool_default_pg_autoscale_mode': 'off', 'osd pool default size': '2', 'mon osd allow primary affinity': 'true', 'mon osd allow pg remap': 'true', 'mon warn on legacy crush tunables': 'false', 'mon warn on crush straw calc version zero': 'false', 'mon warn on no sortbitwise': 'false', 'mon warn on osd down out interval zero': 'false', 'mon warn on too few osds': 'false', 'mon_warn_on_pool_pg_num_not_power_of_two': 'false', 'mon_warn_on_pool_no_redundancy': 'false', 'mon_allow_pool_size_one': 'true', 'osd pool default erasure code profile': 'plugin=isa technique=reed_sol_van k=2 m=1 
crush-failure-domain=osd', 'osd default data pool replay window': '5', 'mon allow pool delete': 'true', 'mon cluster log file level': 'debug', 'debug asserts on shutdown': 'true', 'mon health detail to clog': 'false', 'mon host': '192.168.123.104', 'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'osd': {'osd journal size': '100', 'osd scrub load threshold': '5.0', 'osd scrub max interval': '600', 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': 'true', 'osd recover clone overlap': 'true', 'osd recovery max chunk': '1048576', 'osd debug shutdown': 'true', 'osd debug op order': 'true', 'osd debug verify stray on activate': 'true', 'osd debug trim objects': 'true', 'osd open classes on start': 'true', 'osd debug pg log writeout': 'true', 'osd deep scrub update digest min age': '30', 'osd map max advance': '10', 'journal zero on create': 'true', 'filestore ondisk finisher threads': '3', 'filestore apply finisher threads': '3', 'bdev debug aio': 'true', 'osd debug misdirected ops': 'true', 'bluefs allocator': 'hybrid', 'bluestore allocator': 'hybrid', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}, 'mgr': {'debug ms': 1, 'debug mgr': 20, 'debug mon': '20', 'debug auth': '20', 'mon reweight min pgs per osd': '4', 'mon reweight min bytes per osd': '10', 'mgr/telemetry/nag': 'false'}, 'mon': {'debug ms': 1, 'debug mon': 20, 'debug paxos': 20, 'debug auth': '20', 'mon data avail warn': '5', 'mon mgr mkfs grace': '240', 'mon reweight min pgs per osd': '4', 'mon osd reporter subtree level': 'osd', 'mon osd prime pg temp': 
'true', 'mon reweight min bytes per osd': '10', 'auth mon ticket ttl': '660', 'auth service ticket ttl': '240', 'mon_warn_on_insecure_global_id_reclaim': 'false', 'mon_warn_on_insecure_global_id_reclaim_allowed': 'false', 'mon_down_mkfs_grace': '2m', 'mon_warn_on_filestore_osds': 'false'}, 'client': {'rgw cache enabled': 'true', 'rgw enable ops log': 'true', 'rgw enable usage log': 'true', 'log file': '/var/log/ceph/$cluster-$name.$pid.log', 'admin socket': '/var/run/ceph/$cluster-$name.$pid.asok', 'rbd default format': 1}, 'mon.a': {}} 2026-03-25T15:27:40.215 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-25T15:27:40.215 DEBUG:teuthology.orchestra.run.vm04:> dd of=/home/ubuntu/cephtest/ceph.tmp.conf 2026-03-25T15:27:40.274 DEBUG:teuthology.orchestra.run.vm04:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage monmaptool -c /home/ubuntu/cephtest/ceph.tmp.conf --create --clobber --enable-all-features --add a 192.168.123.104 --print /home/ubuntu/cephtest/ceph.monmap 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool: monmap file /home/ubuntu/cephtest/ceph.monmap 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool: generated fsid 08196b8a-fd91-49b2-b8a6-e1d21f829086 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:setting min_mon_release = tentacle 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:epoch 0 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:fsid 08196b8a-fd91-49b2-b8a6-e1d21f829086 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:last_changed 2026-03-25T15:27:40.352797+0000 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-25T15:27:40.352797+0000 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:min_mon_release 20 (tentacle) 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:election_strategy: 1 2026-03-25T15:27:40.354 
INFO:teuthology.orchestra.run.vm04.stdout:0: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.a 2026-03-25T15:27:40.354 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool: writing epoch 0 to /home/ubuntu/cephtest/ceph.monmap (1 monitors) 2026-03-25T15:27:40.356 DEBUG:teuthology.orchestra.run.vm04:> rm -- /home/ubuntu/cephtest/ceph.tmp.conf 2026-03-25T15:27:40.417 INFO:tasks.ceph:Writing /etc/ceph/ceph.conf for FSID 08196b8a-fd91-49b2-b8a6-e1d21f829086... 2026-03-25T15:27:40.418 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /etc/ceph && sudo chmod 0755 /etc/ceph && sudo tee /etc/ceph/ceph.conf && sudo chmod 0644 /etc/ceph/ceph.conf > /dev/null 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout:[global] 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: chdir = "" 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: pid file = /var/run/ceph/$cluster-$name.pid 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: auth supported = cephx 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: filestore xattr use omap = true 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon clock drift allowed = 1.000 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: osd crush chooseleaf type = 0 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: auth debug = true 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: ms die on old message = true 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: ms die on bug = true 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon max pg 
per osd = 10000 # >= luminous 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon pg warn max object skew = 0 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: # disable pg_autoscaler by default for new pools 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: osd_pool_default_pg_autoscale_mode = off 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: osd pool default size = 2 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon osd allow primary affinity = true 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon osd allow pg remap = true 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on legacy crush tunables = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on crush straw calc version zero = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on no sortbitwise = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on osd down out interval zero = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on too few osds = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_pool_pg_num_not_power_of_two = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_pool_no_redundancy = false 2026-03-25T15:27:40.514 INFO:teuthology.orchestra.run.vm04.stdout: mon_allow_pool_size_one = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd pool default erasure code profile = plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd 2026-03-25T15:27:40.515 
INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd default data pool replay window = 5 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon allow pool delete = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon cluster log file level = debug 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: debug asserts on shutdown = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon health detail to clog = false 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon host = 192.168.123.104 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon client directed command retry = 5 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon warn on pool no app = False 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: ms inject socket failures = 5000 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: fsid = 08196b8a-fd91-49b2-b8a6-e1d21f829086 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout:[osd] 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd journal size = 100 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd scrub load threshold = 5.0 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd scrub max interval = 600 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd mclock profile = high_recovery_ops 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd mclock skip benchmark = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 
INFO:teuthology.orchestra.run.vm04.stdout: osd recover clone overlap = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd recovery max chunk = 1048576 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd debug shutdown = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd debug op order = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd debug verify stray on activate = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd debug trim objects = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd open classes on start = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd debug pg log writeout = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd deep scrub update digest min age = 30 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd map max advance = 10 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: journal zero on create = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: filestore ondisk finisher threads = 3 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: filestore apply finisher threads = 3 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: bdev debug aio = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd debug misdirected ops = true 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 
bluefs allocator = hybrid 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: bluestore allocator = hybrid 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: bluestore block size = 96636764160 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: bluestore fsck on mount = True 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: debug bluefs = 1/20 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: debug bluestore = 1/20 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: debug ms = 1 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: debug osd = 20 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: debug rocksdb = 4/10 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon osd backfillfull_ratio = 0.85 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon osd full ratio = 0.9 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: mon osd nearfull ratio = 0.8 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd failsafe full ratio = 0.95 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd mclock iops capacity threshold hdd = 49000 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd objectstore = bluestore 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: osd shutdown pgref assert = True 2026-03-25T15:27:40.515 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout:[mgr] 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug ms = 1 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug mgr = 20 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug mon = 20 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug auth = 20 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min pgs 
per osd = 4 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min bytes per osd = 10 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mgr/telemetry/nag = false 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout:[mon] 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug ms = 1 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug mon = 20 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug paxos = 20 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: debug auth = 20 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon data avail warn = 5 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon mgr mkfs grace = 240 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min pgs per osd = 4 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon osd reporter subtree level = osd 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon osd prime pg temp = true 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon reweight min bytes per osd = 10 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: # rotate auth tickets quickly to exercise renewal paths 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: auth mon ticket ttl = 660 # 11m 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: auth service ticket ttl = 240 # 4m 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: # don't complain about insecure global_id in the test suite 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_insecure_global_id_reclaim = false 2026-03-25T15:27:40.516 
INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_insecure_global_id_reclaim_allowed = false 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: # 1m isn't quite enough 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon_down_mkfs_grace = 2m 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: mon_warn_on_filestore_osds = false 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout:[client] 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: rgw cache enabled = true 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: rgw enable ops log = true 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: rgw enable usage log = true 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: log file = /var/log/ceph/$cluster-$name.$pid.log 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: admin socket = /var/run/ceph/$cluster-$name.$pid.asok 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout: rbd default format = 1 2026-03-25T15:27:40.516 INFO:teuthology.orchestra.run.vm04.stdout:[mon.a] 2026-03-25T15:27:40.525 INFO:tasks.ceph:Creating admin key on mon.a... 2026-03-25T15:27:40.535 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=client.admin --cap mon 'allow *' --cap osd 'allow *' --cap mds 'allow *' --cap mgr 'allow *' /etc/ceph/ceph.keyring 2026-03-25T15:27:40.597 INFO:tasks.ceph:Copying monmap to all nodes... 
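The `[global]`/`[osd]`/`[mgr]`/`[mon]`/`[client]` sections dumped above are rendered from the nested conf dict logged earlier ("writing out conf {...}"). A minimal sketch of that ini-style rendering, using a hypothetical `render_conf` helper (not the actual teuthology function):

```python
# Sketch: render a nested conf dict into the ini-style ceph.conf seen
# in the log. render_conf is a hypothetical helper for illustration,
# not the real teuthology API.
def render_conf(conf: dict) -> str:
    lines = []
    for section, options in conf.items():
        lines.append(f"[{section}]")
        for key, value in options.items():
            lines.append(f"\t{key} = {value}")
        lines.append("")
    return "\n".join(lines)

# A small subset of the dict from the log above.
conf = {
    "global": {"mon host": "192.168.123.104", "auth supported": "cephx"},
    "client": {"rbd default format": 1},
    "mon.a": {},
}
print(render_conf(conf))
```

Note that empty sections such as `mon.a` still emit a bare `[mon.a]` header, matching the tail of the conf dump above.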
2026-03-25T15:27:40.598 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-25T15:27:40.598 DEBUG:teuthology.orchestra.run.vm04:> dd if=/etc/ceph/ceph.keyring of=/dev/stdout 2026-03-25T15:27:40.618 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-25T15:27:40.618 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/ceph.monmap of=/dev/stdout 2026-03-25T15:27:40.679 INFO:tasks.ceph:Sending monmap to node ubuntu@vm04.local 2026-03-25T15:27:40.679 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-25T15:27:40.679 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.keyring 2026-03-25T15:27:40.679 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0644 /etc/ceph/ceph.keyring 2026-03-25T15:27:40.766 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-25T15:27:40.766 DEBUG:teuthology.orchestra.run.vm04:> dd of=/home/ubuntu/cephtest/ceph.monmap 2026-03-25T15:27:40.825 INFO:tasks.ceph:Setting up mon nodes... 2026-03-25T15:27:40.826 INFO:tasks.ceph:Setting up mgr nodes... 2026-03-25T15:27:40.826 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/mgr/ceph-x && sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=mgr.x /var/lib/ceph/mgr/ceph-x/keyring 2026-03-25T15:27:40.921 INFO:teuthology.orchestra.run.vm04.stdout:creating /var/lib/ceph/mgr/ceph-x/keyring 2026-03-25T15:27:40.924 INFO:tasks.ceph:Setting up mds nodes... 2026-03-25T15:27:40.924 INFO:tasks.ceph_client:Setting up client nodes... 2026-03-25T15:27:40.924 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=client.0 /etc/ceph/ceph.client.0.keyring && sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-25T15:27:40.963 INFO:teuthology.orchestra.run.vm04.stdout:creating /etc/ceph/ceph.client.0.keyring 2026-03-25T15:27:40.977 INFO:tasks.ceph:Running mkfs on osd nodes... 
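Before the mkfs step below, teuthology pairs the scratch devices (`/dev/vg_nvme/lv_1`..`lv_4`) with the OSD roles in order, producing the "osd dev map" logged earlier. A sketch of that pairing, with `assign_devs` as a hypothetical stand-in for the real implementation:

```python
# Sketch: pair OSD roles with scratch devices in order, reproducing the
# "osd dev map" from the log. assign_devs is a hypothetical helper, not
# the actual teuthology code.
def assign_devs(roles: list, devs: list) -> dict:
    # zip stops at the shorter list, so surplus devices stay unused
    return dict(zip(roles, devs))

devs = ["/dev/vg_nvme/lv_1", "/dev/vg_nvme/lv_2",
        "/dev/vg_nvme/lv_3", "/dev/vg_nvme/lv_4"]
osd_roles = ["osd.0", "osd.1", "osd.2"]
dev_map = assign_devs(osd_roles, devs)
print(dev_map)  # only three OSDs in this job, so lv_4 is left over
```

With this job's single-node `fixed-1` cluster (three OSDs, four 25%VG LVs), one logical volume is expected to remain unassigned, which matches the map in the log.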
2026-03-25T15:27:40.977 INFO:tasks.ceph:ctx.disk_config.remote_to_roles_to_dev: {Remote(name='ubuntu@vm04.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}} 2026-03-25T15:27:40.977 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/osd/ceph-0 2026-03-25T15:27:41.048 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-25T15:27:41.048 INFO:tasks.ceph:role: osd.0 2026-03-25T15:27:41.048 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_1 on ubuntu@vm04.local 2026-03-25T15:27:41.049 DEBUG:teuthology.orchestra.run.vm04:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_1 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout:meta-data=/dev/vg_nvme/lv_1 isize=2048 agcount=4, agsize=1310464 blks 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout: = reflink=1 bigtime=1 inobtcount=1 nrext64=0 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout: = sunit=0 swidth=0 blks 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout:log =internal log bsize=4096 blocks=16384, version=2 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-25T15:27:41.117 INFO:teuthology.orchestra.run.vm04.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-25T15:27:41.140 INFO:teuthology.orchestra.run.vm04.stdout:Discarding blocks...Done. 
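The mkfs.xfs output above also lets you sanity-check the logical volume size: the data line reports `blocks=5241856` at `bsize=4096`, which should correspond to the `25%VG` allocation requested in the ansible vars. A quick arithmetic check:

```python
# Sanity-check the LV size implied by the mkfs.xfs data line:
# total bytes = data blocks * block size.
blocks = 5241856   # "blocks=5241856" from the data line above
bsize = 4096       # "bsize=4096" from the same line
size_gib = blocks * bsize / 2**30
print(f"{size_gib:.1f} GiB per LV")  # ~20 GiB
```

Four such LVs at roughly 20 GiB each imply a volume group of about 80 GiB across `/dev/vdb`..`/dev/vde`, consistent with the `size: 25%VG` entries in the job config.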
2026-03-25T15:27:41.154 INFO:tasks.ceph:mount /dev/vg_nvme/lv_1 on ubuntu@vm04.local -o noatime
2026-03-25T15:27:41.154 DEBUG:teuthology.orchestra.run.vm04:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_1 /var/lib/ceph/osd/ceph-0
2026-03-25T15:27:41.230 DEBUG:teuthology.orchestra.run.vm04:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-0
2026-03-25T15:27:41.300 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/osd/ceph-1
2026-03-25T15:27:41.369 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-25T15:27:41.369 INFO:tasks.ceph:role: osd.1
2026-03-25T15:27:41.369 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_2 on ubuntu@vm04.local
2026-03-25T15:27:41.369 DEBUG:teuthology.orchestra.run.vm04:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_2
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout:meta-data=/dev/vg_nvme/lv_2 isize=2048 agcount=4, agsize=1310464 blks
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout: = reflink=1 bigtime=1 inobtcount=1 nrext64=0
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout: = sunit=0 swidth=0 blks
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout:log =internal log bsize=4096 blocks=16384, version=2
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-25T15:27:41.441 INFO:teuthology.orchestra.run.vm04.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-25T15:27:41.456 INFO:teuthology.orchestra.run.vm04.stdout:Discarding blocks...Done.
2026-03-25T15:27:41.460 INFO:tasks.ceph:mount /dev/vg_nvme/lv_2 on ubuntu@vm04.local -o noatime
2026-03-25T15:27:41.460 DEBUG:teuthology.orchestra.run.vm04:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_2 /var/lib/ceph/osd/ceph-1
2026-03-25T15:27:41.538 DEBUG:teuthology.orchestra.run.vm04:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-1
2026-03-25T15:27:41.606 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/osd/ceph-2
2026-03-25T15:27:41.676 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-25T15:27:41.676 INFO:tasks.ceph:role: osd.2
2026-03-25T15:27:41.676 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_3 on ubuntu@vm04.local
2026-03-25T15:27:41.676 DEBUG:teuthology.orchestra.run.vm04:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_3
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout:meta-data=/dev/vg_nvme/lv_3 isize=2048 agcount=4, agsize=1310464 blks
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout: = reflink=1 bigtime=1 inobtcount=1 nrext64=0
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout: = sunit=0 swidth=0 blks
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout:log =internal log bsize=4096 blocks=16384, version=2
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-25T15:27:41.744 INFO:teuthology.orchestra.run.vm04.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-25T15:27:41.761 INFO:teuthology.orchestra.run.vm04.stdout:Discarding blocks...Done.
2026-03-25T15:27:41.764 INFO:tasks.ceph:mount /dev/vg_nvme/lv_3 on ubuntu@vm04.local -o noatime
2026-03-25T15:27:41.793 DEBUG:teuthology.orchestra.run.vm04:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_3 /var/lib/ceph/osd/ceph-2
2026-03-25T15:27:41.833 DEBUG:teuthology.orchestra.run.vm04:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-2
2026-03-25T15:27:41.903 DEBUG:teuthology.orchestra.run.vm04:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 0 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-25T15:27:41.990 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:41.988+0000 7f96155b0900 -1 auth: error reading file: /var/lib/ceph/osd/ceph-0/keyring: can't open /var/lib/ceph/osd/ceph-0/keyring: (2) No such file or directory
2026-03-25T15:27:41.991 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:41.988+0000 7f96155b0900 -1 created new key in keyring /var/lib/ceph/osd/ceph-0/keyring
2026-03-25T15:27:41.991 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:41.988+0000 7f96155b0900 -1 bdev(0x55929c8ab800 /var/lib/ceph/osd/ceph-0/block) open stat got: (1) Operation not permitted
2026-03-25T15:27:41.991 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:41.988+0000 7f96155b0900 -1 bluestore(/var/lib/ceph/osd/ceph-0) _read_fsid unparsable uuid
2026-03-25T15:27:43.333 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-25T15:27:43.363 DEBUG:teuthology.orchestra.run.vm04:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 1 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-25T15:27:43.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:43.445+0000 7ff753c89900 -1 auth: error reading file: /var/lib/ceph/osd/ceph-1/keyring: can't open /var/lib/ceph/osd/ceph-1/keyring: (2) No such file or directory
2026-03-25T15:27:43.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:43.445+0000 7ff753c89900 -1 created new key in keyring /var/lib/ceph/osd/ceph-1/keyring
2026-03-25T15:27:43.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:43.446+0000 7ff753c89900 -1 bdev(0x5571045c9800 /var/lib/ceph/osd/ceph-1/block) open stat got: (1) Operation not permitted
2026-03-25T15:27:43.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:43.446+0000 7ff753c89900 -1 bluestore(/var/lib/ceph/osd/ceph-1) _read_fsid unparsable uuid
2026-03-25T15:27:44.085 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-25T15:27:44.115 DEBUG:teuthology.orchestra.run.vm04:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 2 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-25T15:27:44.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:44.197+0000 7f176a1cf900 -1 auth: error reading file: /var/lib/ceph/osd/ceph-2/keyring: can't open /var/lib/ceph/osd/ceph-2/keyring: (2) No such file or directory
2026-03-25T15:27:44.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:44.197+0000 7f176a1cf900 -1 created new key in keyring /var/lib/ceph/osd/ceph-2/keyring
2026-03-25T15:27:44.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:44.197+0000 7f176a1cf900 -1 bdev(0x564a6bdcd800 /var/lib/ceph/osd/ceph-2/block) open stat got: (1) Operation not permitted
2026-03-25T15:27:44.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-25T15:27:44.197+0000 7f176a1cf900 -1 bluestore(/var/lib/ceph/osd/ceph-2) _read_fsid unparsable uuid
2026-03-25T15:27:44.925 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-25T15:27:44.951 INFO:tasks.ceph:Reading keys from all nodes...
2026-03-25T15:27:44.951 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:44.951 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/mgr/ceph-x/keyring of=/dev/stdout
2026-03-25T15:27:45.016 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:45.016 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-0/keyring of=/dev/stdout
2026-03-25T15:27:45.084 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:45.085 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-1/keyring of=/dev/stdout
2026-03-25T15:27:45.153 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:45.153 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-2/keyring of=/dev/stdout
2026-03-25T15:27:45.231 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:45.231 DEBUG:teuthology.orchestra.run.vm04:> dd if=/etc/ceph/ceph.client.0.keyring of=/dev/stdout
2026-03-25T15:27:45.289 INFO:tasks.ceph:Adding keys to all mons...
2026-03-25T15:27:45.290 DEBUG:teuthology.orchestra.run.vm04:> sudo tee -a /etc/ceph/ceph.keyring
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout:[mgr.x]
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout: key = AQDs/sNpGODcNhAAhIsn1FAENkZVcUZ5sFl93g==
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout:[osd.0]
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout: key = AQDt/sNp7vX3OhAAuFEfSkNWXlI62O5RXB0D9A==
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout:[osd.1]
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout: key = AQDv/sNpq3uhGhAAILPMw146xTWgaTceyrMGVA==
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout:[osd.2]
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout: key = AQDw/sNpsYTTCxAAEBCBIS+DP/ctFJ+eAJMY3w==
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout:[client.0]
2026-03-25T15:27:45.355 INFO:teuthology.orchestra.run.vm04.stdout: key = AQDs/sNp6t1dORAA2T6yGXFrAWj2ja61fC6Iqg==
2026-03-25T15:27:45.357 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=mgr.x --cap mon 'allow profile mgr' --cap osd 'allow *' --cap mds 'allow *'
2026-03-25T15:27:45.444 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.0 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-25T15:27:45.491 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.1 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-25T15:27:45.539 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.2 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-25T15:27:45.584 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=client.0 --cap mon 'allow rw' --cap mgr 'allow r' --cap osd 'allow rwx' --cap mds allow
2026-03-25T15:27:45.635 INFO:tasks.ceph:Running mkfs on mon nodes...
2026-03-25T15:27:45.635 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /var/lib/ceph/mon/ceph-a
2026-03-25T15:27:45.658 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i a --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring
2026-03-25T15:27:46.325 DEBUG:teuthology.orchestra.run.vm04:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-a
2026-03-25T15:27:46.352 DEBUG:teuthology.orchestra.run.vm04:> rm -- /home/ubuntu/cephtest/ceph.monmap
2026-03-25T15:27:46.416 INFO:tasks.ceph:Starting mon daemons in cluster ceph...
2026-03-25T15:27:46.416 INFO:tasks.ceph.mon.a:Restarting daemon
2026-03-25T15:27:46.416 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i a
2026-03-25T15:27:46.457 INFO:tasks.ceph.mon.a:Started
2026-03-25T15:27:46.457 INFO:tasks.ceph:Starting mgr daemons in cluster ceph...
2026-03-25T15:27:46.457 INFO:tasks.ceph.mgr.x:Restarting daemon
2026-03-25T15:27:46.458 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mgr -f --cluster ceph -i x
2026-03-25T15:27:46.459 INFO:tasks.ceph.mgr.x:Started
2026-03-25T15:27:46.459 DEBUG:tasks.ceph:set 0 configs
2026-03-25T15:27:46.459 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph config dump
2026-03-25T15:27:46.936 INFO:teuthology.orchestra.run.vm04.stdout:WHO MASK LEVEL OPTION VALUE RO
2026-03-25T15:27:46.951 INFO:tasks.ceph:Setting crush tunables to default
2026-03-25T15:27:46.951 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd crush tunables default
2026-03-25T15:27:47.132 INFO:teuthology.orchestra.run.vm04.stderr:adjusted tunables profile to default
2026-03-25T15:27:47.148 INFO:tasks.ceph:check_enable_crimson: False
2026-03-25T15:27:47.148 INFO:tasks.ceph:Starting osd daemons in cluster ceph...
2026-03-25T15:27:47.149 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:47.149 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-0/fsid of=/dev/stdout
2026-03-25T15:27:47.181 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:47.181 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-1/fsid of=/dev/stdout
2026-03-25T15:27:47.251 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-25T15:27:47.251 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/osd/ceph-2/fsid of=/dev/stdout
2026-03-25T15:27:47.324 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd new 4f49ea48-ae3c-407d-a183-56efa44295aa 0
2026-03-25T15:27:47.515 INFO:teuthology.orchestra.run.vm04.stdout:0
2026-03-25T15:27:47.527 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd new f2612533-a570-4836-a45a-2ae6235f25d4 1
2026-03-25T15:27:47.676 INFO:teuthology.orchestra.run.vm04.stdout:1
2026-03-25T15:27:47.691 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd new e252769d-790d-4ead-9e7c-79a5105f87f1 2
2026-03-25T15:27:47.831 INFO:tasks.ceph.mgr.x.vm04.stderr:/usr/lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
2026-03-25T15:27:47.832 INFO:tasks.ceph.mgr.x.vm04.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
2026-03-25T15:27:47.832 INFO:tasks.ceph.mgr.x.vm04.stderr: from numpy import show_config as show_numpy_config
2026-03-25T15:27:47.844 INFO:teuthology.orchestra.run.vm04.stdout:2
2026-03-25T15:27:47.860 INFO:tasks.ceph.osd.0:Restarting daemon
2026-03-25T15:27:47.860 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0
2026-03-25T15:27:47.862 INFO:tasks.ceph.osd.0:Started
2026-03-25T15:27:47.862 INFO:tasks.ceph.osd.1:Restarting daemon
2026-03-25T15:27:47.862 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 1
2026-03-25T15:27:47.864 INFO:tasks.ceph.osd.1:Started
2026-03-25T15:27:47.865 INFO:tasks.ceph.osd.2:Restarting daemon
2026-03-25T15:27:47.865 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 2
2026-03-25T15:27:47.868 INFO:tasks.ceph.osd.2:Started
2026-03-25T15:27:47.868 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json
2026-03-25T15:27:48.040 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:27:48.040 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":5,"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","created":"2026-03-25T15:27:46.739365+0000","modified":"2026-03-25T15:27:47.831101+0000","last_up_change":"0.000000","last_in_change":"2026-03-25T15:27:47.831101+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"4f49ea48-ae3c-407d-a183-56efa44295aa","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"f2612533-a570-4836-a45a-2ae6235f25d4","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"e252769d-790d-4ead-9e7c-79a5105f87f1","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-25T15:27:48.053 INFO:tasks.ceph.ceph_manager.ceph:[]
2026-03-25T15:27:48.054 INFO:tasks.ceph:Waiting for OSDs to come up
2026-03-25T15:27:48.078 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-25T15:27:48.075+0000 7f18772c8900 -1 Falling back to public interface
2026-03-25T15:27:48.092 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-25T15:27:48.089+0000 7fbe78213900 -1 Falling back to public interface
2026-03-25T15:27:48.100 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-25T15:27:48.097+0000 7ffa2dc67900 -1 Falling back to public interface
2026-03-25T15:27:48.355 DEBUG:teuthology.orchestra.run.vm04:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-25T15:27:48.675 INFO:teuthology.misc.health.vm04.stdout:
2026-03-25T15:27:48.675 INFO:teuthology.misc.health.vm04.stdout:{"epoch":5,"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","created":"2026-03-25T15:27:46.739365+0000","modified":"2026-03-25T15:27:47.831101+0000","last_up_change":"0.000000","last_in_change":"2026-03-25T15:27:47.831101+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"4f49ea48-ae3c-407d-a183-56efa44295aa","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"f2612533-a570-4836-a45a-2ae6235f25d4","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"e252769d-790d-4ead-9e7c-79a5105f87f1","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-25T15:27:48.684 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-25T15:27:48.675+0000 7f18772c8900 -1 osd.2 0 log_to_monitors true
2026-03-25T15:27:48.698 DEBUG:teuthology.misc:0 of 3 OSDs are up
2026-03-25T15:27:48.970 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-25T15:27:48.968+0000 7fbe78213900 -1 osd.1 0 log_to_monitors true
2026-03-25T15:27:48.971 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-25T15:27:48.968+0000 7ffa2dc67900 -1 osd.0 0 log_to_monitors true
2026-03-25T15:27:49.948 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-25T15:27:49.945+0000 7f1873259640 -1 osd.2 0 waiting for initial osdmap
2026-03-25T15:27:49.964 INFO:tasks.ceph.osd.2.vm04.stderr:2026-03-25T15:27:49.962+0000 7f186e05e640 -1 osd.2 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-25T15:27:50.844 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:27:50.841+0000 7fea67bad640 -1 mgr.server handle_report got status from non-daemon mon.a
2026-03-25T15:27:50.952 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-25T15:27:50.949+0000 7fbe738a3640 -1 osd.1 0 waiting for initial osdmap
2026-03-25T15:27:50.952 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-25T15:27:50.949+0000 7ffa29bf6640 -1 osd.0 0 waiting for initial osdmap
2026-03-25T15:27:50.964 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-25T15:27:50.962+0000 7fbe6e6a8640 -1 osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-25T15:27:50.965 INFO:tasks.ceph.osd.0.vm04.stderr:2026-03-25T15:27:50.962+0000 7ffa249fb640 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-25T15:27:55.000 DEBUG:teuthology.orchestra.run.vm04:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-25T15:27:55.230 INFO:teuthology.misc.health.vm04.stdout:
2026-03-25T15:27:55.230 INFO:teuthology.misc.health.vm04.stdout:{"epoch":11,"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","created":"2026-03-25T15:27:46.739365+0000","modified":"2026-03-25T15:27:54.962338+0000","last_up_change":"2026-03-25T15:27:51.947587+0000","last_in_change":"2026-03-25T15:27:47.831101+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":5,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-25T15:27:53.853201+0000","flags":32769,"flags_names":"hashpspool,creating","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"4f49ea48-ae3c-407d-a183-56efa44295aa","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6817","nonce":2768043570}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6819","nonce":2768043570}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6823","nonce":2768043570}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6821","nonce":2768043570}]},"public_addr":"192.168.123.104:6817/2768043570","cluster_addr":"192.168.123.104:6819/2768043570","heartbeat_back_addr":"192.168.123.104:6823/2768043570","heartbeat_front_addr":"192.168.123.104:6821/2768043570","state":["exists","up"]},{"osd":1,"uuid":"f2612533-a570-4836-a45a-2ae6235f25d4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":10,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6809","nonce":2830815940}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6811","nonce":2830815940}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6815","nonce":2830815940}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6813","nonce":2830815940}]},"public_addr":"192.168.123.104:6809/2830815940","cluster_addr":"192.168.123.104:6811/2830815940","heartbeat_back_addr":"192.168.123.104:6815/2830815940","heartbeat_front_addr":"192.168.123.104:6813/2830815940","state":["exists","up"]},{"osd":2,"uuid":"e252769d-790d-4ead-9e7c-79a5105f87f1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6801","nonce":2246359187}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6803","nonce":2246359187}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6807","nonce":2246359187}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6805","nonce":2246359187}]},"public_addr":"192.168.123.104:6801/2246359187","cluster_addr":"192.168.123.104:6803/2246359187","heartbeat_back_addr":"192.168.123.104:6807/2246359187","heartbeat_front_addr":"192.168.123.104:6805/2246359187","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-25T15:27:55.240 DEBUG:teuthology.misc:3 of 3 OSDs are up
2026-03-25T15:27:55.240 INFO:tasks.ceph:Creating RBD pool
2026-03-25T15:27:55.240 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --cluster ceph osd pool create rbd 8
2026-03-25T15:27:55.972 INFO:teuthology.orchestra.run.vm04.stderr:pool 'rbd' created
2026-03-25T15:27:55.990 DEBUG:teuthology.orchestra.run.vm04:> rbd --cluster ceph pool init rbd
2026-03-25T15:27:58.993 INFO:tasks.ceph:Starting mds daemons in cluster ceph...
2026-03-25T15:27:58.993 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph config log 1 --format=json
2026-03-25T15:27:58.993 INFO:tasks.daemonwatchdog.daemon_watchdog:watchdog starting
2026-03-25T15:27:59.203 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:27:59.215 INFO:teuthology.orchestra.run.vm04.stdout:[{"version":1,"timestamp":"0.000000","name":"","changes":[]}]
2026-03-25T15:27:59.215 INFO:tasks.ceph_manager:config epoch is 1
2026-03-25T15:27:59.215 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean...
2026-03-25T15:27:59.215 INFO:tasks.ceph.ceph_manager.ceph:waiting for mgr available 2026-03-25T15:27:59.215 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph mgr dump --format=json 2026-03-25T15:27:59.455 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:59.467 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":5,"flags":0,"active_gid":4100,"active_name":"x","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":408510450},{"type":"v1","addr":"192.168.123.104:6825","nonce":408510450}]},"active_addr":"192.168.123.104:6825/408510450","active_change":"2026-03-25T15:27:49.831871+0000","active_mgr_features":4544132024016699391,"available":true,"standbys":[],"modules":["iostat","nfs"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health 
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to, use commas to separate multiple","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","read","upmap","upmap-read"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"update_pg_upmap_activity":{"name":"update_pg_upmap_activity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Updates pg_upmap activity stats to be used in `balancer status detail`","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent 
ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cephadm_log_destination":{"name":"cephadm_log_destination","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":["file","file,syslog","syslog"],"desc":"Destination for cephadm command's persistent logging","long_desc":"","tags":[],"see_also":[]},"certificate_automated_rotation_enabled":{"name":"certificate_automated_rotation_enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"This flag controls whether cephadm automatically rotates certificates upon expiration.","long_desc":"","tags":[],"see_also":[]},"certificate_check_debug_mode":{"name":"certificate_check_debug_mode","type":"bool","level":"dev","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"FOR TESTING ONLY: This flag forces the certificate check instead of waiting for 
certificate_check_period.","long_desc":"","tags":[],"see_also":[]},"certificate_check_period":{"name":"certificate_check_period","type":"int","level":"advanced","flags":0,"default_value":"1","min":"0","max":"30","enum_allowed":[],"desc":"Specifies how often (in days) the certificate should be checked for validity.","long_desc":"","tags":[],"see_also":[]},"certificate_duration_days":{"name":"certificate_duration_days","type":"int","level":"advanced","flags":0,"default_value":"1095","min":"90","max":"3650","enum_allowed":[],"desc":"Specifies the duration of self certificates generated and signed by cephadm root CA","long_desc":"","tags":[],"see_also":[]},"certificate_renewal_threshold_days":{"name":"certificate_renewal_threshold_days","type":"int","level":"advanced","flags":0,"default_value":"30","min":"10","max":"90","enum_allowed":[],"desc":"Specifies the lead time in days to initiate certificate renewal before expiration.","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.28.1","min":"","max":"","enum_allowed":[],"desc":"Alertmanager container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"Elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/grafana:12.3.1","min":"","max":"","enum_allowed":[],"desc":"Grafana container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"Haproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_nginx":{"name":"container_image_nginx","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nginx:sclorg-nginx-126","min":"","max":"","enum_allowed":[],"desc":"Nginx container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.9.1","min":"","max":"","enum_allowed":[],"desc":"Node exporter container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.5","min":"","max":"","enum_allowed":[],"desc":"Nvmeof container image","long_desc":"","tags":[],"see_also":[]},"container_image_oauth2_proxy":{"name":"container_image_oauth2_proxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/oauth2-proxy/oauth2-proxy:v7.6.0","min":"","max":"","enum_allowed":[],"desc":"Oauth2 proxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v3.6.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba":{"name":"container_image_samba","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-server:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba_metrics":{"name":"container_image_samba_metrics","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-metrics:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba metrics container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"Snmp gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in 
seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"grafana_dashboards_path":{"name":"grafana_dashboards_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/grafana/dashboards/ceph-dashboard/","min":"","max":"","enum_allowed":[],"desc":"location of dashboards to include in grafana deployments","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every 
host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus 
deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_count_max":{"name":"ssh_keepalive_count_max","type":"int","level":"advanced","flags":0,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"How many times ssh connections can fail liveness checks before the host is marked offline","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_interval":{"name":"ssh_keepalive_interval","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"How often ssh connections are checked for liveness","long_desc":"","tags":[],"see_also":[]},"stray_daemon_check_interval":{"name":"stray_daemon_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"how frequently cephadm should check for the presence of stray 
daemons","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MANAGED_BY_CLUSTERS":{"name":"MANAGED_BY_CLUSTERS","type":"str","level":"advanced","flags":0,"default_value":"[]","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MULTICLUSTER_CONFIG":{"name":"MULTICLUSTER_CONFIG","type":"str","level":"advanced","flags":0,"default_value":"{}","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROM_ALERT_CREDENTIAL_CACHE_TTL":{"name":"PROM_ALERT_CREDENTIAL_CACHE_TTL","type":"int","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_PO
LICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advan
ced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_HOSTNAME_PER_DAEMON":{"name":"RGW_HOSTNAME_PER_DAEMON","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD
_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crypto_caller":{"name":"crypto_caller","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sso_oauth2":{"name":"sso_oauth2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}
,{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health 
metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not 
found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level
","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bo
ol","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","typ
e":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":""
,"enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The 
factor by which the `NEW PG_NUM` must vary from the current `PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","
max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_al
lowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"prometheus_tls_secret_name":{"name":"prometheus_tls_secret_name","type":"str","level":"advanced",
"flags":0,"default_value":"rook-ceph-prometheus-server-tls","min":"","max":"","enum_allowed":[],"desc":"name of tls secret in k8s for prometheus","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{
"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"te
stnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"smb","can_run":true,"error_string":"","module_options":{"internal_store_backend":{"name":"internal_store_backend","type":"str","level":"dev","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"set internal store backend. for development and testing only","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"update_orchestration":{"name":"update_orchestration","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically update orchestration when smb resources are 
changed","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","leve
l":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","
level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack traces, 
etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leader
board","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrat
or","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","lo
ng_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are 
busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"squid":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"tentacle":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":0,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":1771300012}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":3431459337}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":3897111736}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":187548524}]}]} 2026-03-25T15:27:59.467 INFO:tasks.ceph.ceph_manager.ceph:mgr available! 
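The mgr dump above ends with `available_modules` entries (each carrying a `name`, `can_run` flag, and per-module option schema) plus an `always_on_modules` map keyed by release. A minimal sketch of pulling those fields out of the JSON — the sample below is a trimmed, hand-written stand-in for the much larger dump in the log, not the real payload:

```python
import json

# Trimmed stand-in for the `ceph mgr dump` JSON seen above
# (module_options and most modules elided for brevity).
sample = json.loads("""
{
  "available_modules": [
    {"name": "volumes", "can_run": true, "error_string": ""}
  ],
  "always_on_modules": {
    "tentacle": ["balancer", "crash", "devicehealth", "orchestrator",
                 "pg_autoscaler", "progress", "rbd_support", "status",
                 "telemetry", "volumes"]
  }
}
""")

# Modules that report they can run on this mgr.
runnable = [m["name"] for m in sample["available_modules"] if m["can_run"]]
print(runnable)  # ['volumes']

# Always-on set for the release the cluster requires (here: tentacle).
print(sample["always_on_modules"]["tentacle"])
```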
2026-03-25T15:27:59.467 INFO:tasks.ceph.ceph_manager.ceph:waiting for all up 2026-03-25T15:27:59.468 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-25T15:27:59.667 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:59.667 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":15,"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","created":"2026-03-25T15:27:46.739365+0000","modified":"2026-03-25T15:27:58.978658+0000","last_up_change":"2026-03-25T15:27:51.947587+0000","last_in_change":"2026-03-25T15:27:47.831101+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":5,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-25T15:27:53.853201+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"12","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_
max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-25T15:27:55.461429+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none",
"target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"4f49ea48-ae3c-407d-a183-56efa44295aa","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6817","nonce":2768043570}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6819","nonce":2768043570}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6823","nonce":2768043570}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6821","nonce":2768043570}]},"public_addr":"192.168.123.104:6817/2768043570","cluster_addr":"192.168.123.104:6819/2768043570","heartbeat_back_addr":"192.168.123.104:6823/2768043570","heartbeat_front_addr":"192.168.123.104:6821/2768043570","state":["exists","up"]},{"osd":1,"uui
d":"f2612533-a570-4836-a45a-2ae6235f25d4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6809","nonce":2830815940}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6811","nonce":2830815940}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6815","nonce":2830815940}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6813","nonce":2830815940}]},"public_addr":"192.168.123.104:6809/2830815940","cluster_addr":"192.168.123.104:6811/2830815940","heartbeat_back_addr":"192.168.123.104:6815/2830815940","heartbeat_front_addr":"192.168.123.104:6813/2830815940","state":["exists","up"]},{"osd":2,"uuid":"e252769d-790d-4ead-9e7c-79a5105f87f1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6801","nonce":2246359187}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6803","nonce":2246359187}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6807","nonce":2246359187}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6805","nonce":2246359187}]},"public_addr":"192.168.123.104:6801/2246359187","cluster_addr":"192.168.123.104:6803/2246359187","heartbeat_back_addr":"19
2.168.123.104:6807/2246359187","heartbeat_front_addr":"192.168.123.104:6805/2246359187","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-25T15:27:59.682 INFO:tasks.ceph.ceph_manager.ceph:all up! 
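The "waiting for all up" step above polls `ceph osd dump --format=json` until every OSD reports both up and in. A minimal sketch of that check, assuming only the `osds` array fields visible in the dump (the real logic lives in teuthology's `ceph_manager`; this is an illustration, not its implementation):

```python
import json

# Trimmed stand-in for the `osds` section of the osd dump above:
# all three OSDs report up=1, in=1.
osd_dump = json.loads("""
{"epoch": 15,
 "osds": [
   {"osd": 0, "up": 1, "in": 1, "state": ["exists", "up"]},
   {"osd": 1, "up": 1, "in": 1, "state": ["exists", "up"]},
   {"osd": 2, "up": 1, "in": 1, "state": ["exists", "up"]}
 ]}
""")

def all_up(dump):
    """True when every OSD in the dump reports up==1 and in==1."""
    return all(o["up"] == 1 and o["in"] == 1 for o in dump["osds"])

print(all_up(osd_dump))  # True: "all up!"
```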
2026-03-25T15:27:59.682 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-25T15:27:59.883 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:27:59.883 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":15,"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","created":"2026-03-25T15:27:46.739365+0000","modified":"2026-03-25T15:27:58.978658+0000","last_up_change":"2026-03-25T15:27:51.947587+0000","last_in_change":"2026-03-25T15:27:47.831101+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":5,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-25T15:27:53.853201+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"12","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_
mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-25T15:27:55.461429+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":
400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"4f49ea48-ae3c-407d-a183-56efa44295aa","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6817","nonce":2768043570}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6819","nonce":2768043570}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6823","nonce":2768043570}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6821","nonce":2768043570}]},"public_addr":"192.168.123.104:6817/2768043570","cluster_addr":"192.168.123.104:6819/2768043570","heartbeat_back_addr":"192.168.123.104:6823/2768043570","heartbeat_front_addr":"192.168.123.104:6821/2768043570","state":["exists","up"]},{"osd":1,"uuid":"f2612533-a570-4836-a45a-2ae6235f25d4","up":1,"in":1,"weight":1,"primary_a
ffinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6809","nonce":2830815940}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6811","nonce":2830815940}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6815","nonce":2830815940}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6813","nonce":2830815940}]},"public_addr":"192.168.123.104:6809/2830815940","cluster_addr":"192.168.123.104:6811/2830815940","heartbeat_back_addr":"192.168.123.104:6815/2830815940","heartbeat_front_addr":"192.168.123.104:6813/2830815940","state":["exists","up"]},{"osd":2,"uuid":"e252769d-790d-4ead-9e7c-79a5105f87f1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6801","nonce":2246359187}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6803","nonce":2246359187}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6807","nonce":2246359187}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6805","nonce":2246359187}]},"public_addr":"192.168.123.104:6801/2246359187","cluster_addr":"192.168.123.104:6803/2246359187","heartbeat_back_addr":"192.168.123.104:6807/2246359187","heartbeat_front_addr":"192.168.123.104:6805/2
246359187","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-25T15:27:59.894 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 flush_pg_stats 2026-03-25T15:27:59.894 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 flush_pg_stats 2026-03-25T15:27:59.895 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 flush_pg_stats 2026-03-25T15:28:00.003 INFO:teuthology.orchestra.run.vm04.stdout:38654705667 2026-03-25T15:28:00.003 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd 
last-stat-seq osd.0 2026-03-25T15:28:00.006 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-25T15:28:00.006 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-25T15:28:00.009 INFO:teuthology.orchestra.run.vm04.stdout:38654705667 2026-03-25T15:28:00.009 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-25T15:28:00.222 INFO:teuthology.orchestra.run.vm04.stdout:38654705666 2026-03-25T15:28:00.234 INFO:tasks.ceph.ceph_manager.ceph:need seq 38654705667 got 38654705666 for osd.0 2026-03-25T15:28:00.271 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-25T15:28:00.272 INFO:teuthology.orchestra.run.vm04.stdout:38654705666 2026-03-25T15:28:00.283 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.2 2026-03-25T15:28:00.285 INFO:tasks.ceph.ceph_manager.ceph:need seq 38654705667 got 38654705666 for osd.1 2026-03-25T15:28:01.235 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-25T15:28:01.284 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-25T15:28:01.285 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-25T15:28:01.446 INFO:teuthology.orchestra.run.vm04.stdout:38654705666 2026-03-25T15:28:01.463 INFO:tasks.ceph.ceph_manager.ceph:need seq 38654705667 got 38654705666 for osd.0 2026-03-25T15:28:01.497 INFO:teuthology.orchestra.run.vm04.stdout:34359738370 2026-03-25T15:28:01.509 
INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.2 2026-03-25T15:28:01.512 INFO:teuthology.orchestra.run.vm04.stdout:38654705666 2026-03-25T15:28:01.523 INFO:tasks.ceph.ceph_manager.ceph:need seq 38654705667 got 38654705666 for osd.1 2026-03-25T15:28:02.464 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-25T15:28:02.510 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-25T15:28:02.524 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-25T15:28:02.694 INFO:teuthology.orchestra.run.vm04.stdout:38654705667 2026-03-25T15:28:02.707 INFO:tasks.ceph.ceph_manager.ceph:need seq 38654705667 got 38654705667 for osd.0 2026-03-25T15:28:02.707 DEBUG:teuthology.parallel:result is None 2026-03-25T15:28:02.733 INFO:teuthology.orchestra.run.vm04.stdout:34359738371 2026-03-25T15:28:02.744 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.2 2026-03-25T15:28:02.744 DEBUG:teuthology.parallel:result is None 2026-03-25T15:28:02.777 INFO:teuthology.orchestra.run.vm04.stdout:38654705667 2026-03-25T15:28:02.790 INFO:tasks.ceph.ceph_manager.ceph:need seq 38654705667 got 38654705667 for osd.1 2026-03-25T15:28:02.790 DEBUG:teuthology.parallel:result is None 2026-03-25T15:28:02.790 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-25T15:28:02.790 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:28:03.055 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:28:03.056 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 
2026-03-25T15:28:03.067 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":17,"stamp":"2026-03-25T15:28:01.840737+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590387,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":125,"num_write_kb":2152,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":95,"ondisk_log_size":95,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":82104,"kb_used_data":1544,"kb_used_omap":21,"kb_used_meta":80426,"kb_avail":283033416,"statfs":{"total":289910292480,"available":289826217984,"internally_reserved":0,"allocated":1581056,"data_stored":1290207,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":22424,"internal_metadata":82356328},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_boun
d":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"3.864109"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.982485+0000","last_change":"2026-03-25T15:27:58.982576+0000","last_active":"2026-03-25T15:27:58.982485+0000","last_peered":"2026-03-25T15:27:58.982485+0000","last_clean":"2026-03-25T15:27:58.982485+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:27:58.982485+0000","last_undegraded":"2026-03-25T15:27:58.982485+0000","last_fullsized":"2026-03-25T15:27:58.982485+0000","mapping_epoch":12,"log_s
tart":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T02:07:21.204079+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000218679,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.981712+0000","last_change":"2026-0
3-25T15:27:58.981926+0000","last_active":"2026-03-25T15:27:58.981712+0000","last_peered":"2026-03-25T15:27:58.981712+0000","last_clean":"2026-03-25T15:27:58.981712+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:27:58.981712+0000","last_undegraded":"2026-03-25T15:27:58.981712+0000","last_fullsized":"2026-03-25T15:27:58.981712+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T16:23:25.050769+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00030590199999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.982451+0000","last_change":"2026-03-25T15:27:58.982520+0000","last_active":"2026-03-25T15:27:58.982451+0000","last_peered":"2026-03-25T15:27:58.982451+0000","last_clean":"2026-03-25T15:27:58.982451+0000","last_became_active":"2026-03-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:27:58.982451+0000","last_undegraded":"2026-03-25T15:27:58.982451+0000","last_fullsized":"2026-03-25T15:27:58.982451+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:28:34.884821+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00048520900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.981773+0000","last_change":"2026-03-25T15:27:58.981970+0000","last_active":"2026-03-25T15:27:58.981773+0000","last_peered":"2026-03-25T15:27:58.981773+0000","last_clean":"2026-03-25T15:27:58.981773+0000","last_became_active":"2026-03-
25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:27:58.981773+0000","last_undegraded":"2026-03-25T15:27:58.981773+0000","last_fullsized":"2026-03-25T15:27:58.981773+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T18:52:46.658653+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00024988799999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.986008+0000","last_change":"2026-03-25T15:27:58.986008+0000","last_active":"2026-03-25T15:27:58.986008+0000","last_peered":"2026-03-25T15:27:58.986008+0000","last_clean":"2026-03-25T15:27:58.986008+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:27:58.986008+0000","last_undegraded":"2026-03-25T15:27:58.986008+0000","last_fullsized":"2026-03-25T15:27:58.986008+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T15:36:07.569759+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00061644300000000005,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:59.727741+0000","last_change":"2026-03-25T15:27:59.727861+0000","last_active":"2026-03-25T15:27:59.727741+0000","last_peered":"2026-03-25T15:27:59.727741+0000","last_clean":"2026-03-25T15:27:59.727741+0000","last_became_active":"2026-03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:27:59.727741+0000","last_undegraded":"2026-03-25T15:27:59.727741+0000","last_fullsized":"2026-03-25T15:27:59.727741+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T17:09:52.152973+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00024846500000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:59.727884+0000","last_change":"2026-03-25T15:27:59.727946+0000","last_active":"2026-03-25T15:27:59.727884+0000","last_peered":"2026-03-25T15:27:59.727884+0000","last_clean":"2026-03-25T15:27:59.727884+0000","last_became_active":"2026-03
-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:27:59.727884+0000","last_undegraded":"2026-03-25T15:27:59.727884+0000","last_fullsized":"2026-03-25T15:27:59.727884+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:53:37.381017+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00021015399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_locat
ion_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.982017+0000","last_change":"2026-03-25T15:27:58.982116+0000","last_active":"2026-03-25T15:27:58.982017+0000","last_peered":"2026-03-25T15:27:58.982017+0000","last_clean":"2026-03-25T15:27:58.982017+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:27:58.982017+0000","last_undegraded":"2026-03-25T15:27:58.982017+0000","last_fullsized":"2026-03-25T15:27:58.982017+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-27T01:00:20.598828+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033210999999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"11'92","reported_seq":132,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.981824+0000","last_change":"2026-03-25T15:27:54.967550+0000","last_active":"2026-03-25T15:27:58.981824+0000","last_peered":"2026-03-25T15:27:58.981824+0000","last_clean":"2026-03-25T15:27:58.981824+0000","last_became_active":"2026-03-25T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:27:58.981824+0000","last_undegraded":"2026-03-25T15:27:58.981824+0000","last_fullsized":"2026-03-25T15:27:58.981824+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_deep_scrub":"0'0","la
st_deep_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_clean_scrub_stamp":"2026-03-25T15:27:53.958171+0000","objects_scrubbed":0,"log_size":92,"log_dups_size":0,"ondisk_log_size":92,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:15:15.083319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shal
low_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":92,"ondisk_l
og_size":92,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705667,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27552,"kb_used_data":712,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344288,"statfs":{"total":96636764160,"available":96608550912,"internally_reserved":0,"allocated":729088,"data_stored":628050,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8127,"internal_metadata":27451457},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705667,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27576,"kb_used_data":712,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344264,"statfs":{"total":96636764160,"available":96608526336,"internally_reserved":0,"allocated":729088,"data_stored":628050,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8127,"internal_metadata":27451457},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26976,"kb_used_data":120,"kb_used_omap":6,"kb_used_meta":26809,"kb_avail":94344864,"statfs":{"total":96636764160,"available":96609140736,"internally_reserved":0,"allocated":122880,"data_stored":34107,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":6170,"internal_metadata":27453414},"hb_peers":
[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-25T15:28:03.067 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:28:03.291 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:28:03.291 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-25T15:28:03.304 
INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":17,"stamp":"2026-03-25T15:28:01.840737+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590387,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":125,"num_write_kb":2152,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":95,"ondisk_log_size":95,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":82104,"kb_used_data":1544,"kb_used_omap":21,"kb_used_meta":80426,"kb_avail":283033416,"statfs":{"total":289910292480,"available":289826217984,"internally_reserved":0,"allocated":1581056,"data_stored":1290207,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":22424,"internal_metadata":82356328},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"comm
it_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"3.864109"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.982485+0000","last_change":"2026-03-25T15:27:58.982576+0000","last_active":"2026-03-25T15:27:58.982485+0000","last_peered":"2026-03-25T15:27:58.982485+0000","last_clean":"2026-03-25T15:27:58.982485+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:27:58.982485+0000","last_undegraded":"2026-03-25T15:27:58.982485+0000","last_fullsized":"2026-03-25T15:27:58.982485+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_
start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T02:07:21.204079+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000218679,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.981712+0000","last_change":"2026-03-25T15:27:58.981926+000
0","last_active":"2026-03-25T15:27:58.981712+0000","last_peered":"2026-03-25T15:27:58.981712+0000","last_clean":"2026-03-25T15:27:58.981712+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:27:58.981712+0000","last_undegraded":"2026-03-25T15:27:58.981712+0000","last_fullsized":"2026-03-25T15:27:58.981712+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T16:23:25.050769+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00030590199999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.982451+0000","last_change":"2026-03-25T15:27:58.982520+0000","last_active":"2026-03-25T15:27:58.982451+0000","last_peered":"2026-03-25T15:27:58.982451+0000","last_clean":"2026-03-25T15:27:58.982451+0000","last_became_active":"2026-03-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:27:58.982451+0000","last_undegraded":"2026-03-25T15:27:58.982451+0000","last_fullsized":"2026-03-25T15:27:58.982451+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:28:34.884821+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00048520900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.981773+0000","last_change":"2026-03-25T15:27:58.981970+0000","last_active":"2026-03-25T15:27:58.981773+0000","last_peered":"2026-03-25T15:27:58.981773+0000","last_clean":"2026-03-25T15:27:58.981773+0000","last_became_active":"2026-03-
25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:27:58.981773+0000","last_undegraded":"2026-03-25T15:27:58.981773+0000","last_fullsized":"2026-03-25T15:27:58.981773+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T18:52:46.658653+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00024988799999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.986008+0000","last_change":"2026-03-25T15:27:58.986008+0000","last_active":"2026-03-25T15:27:58.986008+0000","last_peered":"2026-03-25T15:27:58.986008+0000","last_clean":"2026-03-25T15:27:58.986008+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:27:58.986008+0000","last_undegraded":"2026-03-25T15:27:58.986008+0000","last_fullsized":"2026-03-25T15:27:58.986008+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T15:36:07.569759+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00061644300000000005,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:59.727741+0000","last_change":"2026-03-25T15:27:59.727861+0000","last_active":"2026-03-25T15:27:59.727741+0000","last_peered":"2026-03-25T15:27:59.727741+0000","last_clean":"2026-03-25T15:27:59.727741+0000","last_became_active":"2026-03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:27:59.727741+0000","last_undegraded":"2026-03-25T15:27:59.727741+0000","last_fullsized":"2026-03-25T15:27:59.727741+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T17:09:52.152973+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00024846500000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:59.727884+0000","last_change":"2026-03-25T15:27:59.727946+0000","last_active":"2026-03-25T15:27:59.727884+0000","last_peered":"2026-03-25T15:27:59.727884+0000","last_clean":"2026-03-25T15:27:59.727884+0000","last_became_active":"2026-03
-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:27:59.727884+0000","last_undegraded":"2026-03-25T15:27:59.727884+0000","last_fullsized":"2026-03-25T15:27:59.727884+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:53:37.381017+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00021015399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_locat
ion_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.982017+0000","last_change":"2026-03-25T15:27:58.982116+0000","last_active":"2026-03-25T15:27:58.982017+0000","last_peered":"2026-03-25T15:27:58.982017+0000","last_clean":"2026-03-25T15:27:58.982017+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:27:58.982017+0000","last_undegraded":"2026-03-25T15:27:58.982017+0000","last_fullsized":"2026-03-25T15:27:58.982017+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-27T01:00:20.598828+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033210999999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"11'92","reported_seq":132,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-25T15:27:58.981824+0000","last_change":"2026-03-25T15:27:54.967550+0000","last_active":"2026-03-25T15:27:58.981824+0000","last_peered":"2026-03-25T15:27:58.981824+0000","last_clean":"2026-03-25T15:27:58.981824+0000","last_became_active":"2026-03-25T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:27:58.981824+0000","last_undegraded":"2026-03-25T15:27:58.981824+0000","last_fullsized":"2026-03-25T15:27:58.981824+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_deep_scrub":"0'0","la
st_deep_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_clean_scrub_stamp":"2026-03-25T15:27:53.958171+0000","objects_scrubbed":0,"log_size":92,"log_dups_size":0,"ondisk_log_size":92,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:15:15.083319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shal
low_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":92,"ondisk_l
og_size":92,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705667,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27552,"kb_used_data":712,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344288,"statfs":{"total":96636764160,"available":96608550912,"internally_reserved":0,"allocated":729088,"data_stored":628050,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8127,"internal_metadata":27451457},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705667,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27576,"kb_used_data":712,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344264,"statfs":{"total":96636764160,"available":96608526336,"internally_reserved":0,"allocated":729088,"data_stored":628050,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8127,"internal_metadata":27451457},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26976,"kb_used_data":120,"kb_used_omap":6,"kb_used_meta":26809,"kb_avail":94344864,"statfs":{"total":96636764160,"available":96609140736,"internally_reserved":0,"allocated":122880,"data_stored":34107,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":6170,"internal_metadata":27453414},"hb_peers":
[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-25T15:28:03.304 INFO:tasks.ceph.ceph_manager.ceph:clean! 2026-03-25T15:28:03.304 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 
2026-03-25T15:28:03.305 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy 2026-03-25T15:28:03.305 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph health --format=json 2026-03-25T15:28:03.539 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:28:03.539 INFO:teuthology.orchestra.run.vm04.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-25T15:28:03.552 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy done 2026-03-25T15:28:03.552 INFO:teuthology.run_tasks:Running task workunit... 2026-03-25T15:28:03.556 INFO:tasks.workunit:Pulling workunits from ref 0392f78529848ec72469e8e431875cb98d3a5fb4 2026-03-25T15:28:03.556 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-25T15:28:03.556 DEBUG:teuthology.orchestra.run.vm04:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-25T15:28:03.571 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-25T15:28:03.571 INFO:teuthology.orchestra.run.vm04.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-25T15:28:03.571 DEBUG:teuthology.orchestra.run.vm04:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-25T15:28:03.628 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-25T15:28:03.629 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-25T15:28:03.684 INFO:tasks.workunit:timeout=3h 2026-03-25T15:28:03.684 INFO:tasks.workunit:cleanup=True 2026-03-25T15:28:03.684 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4 2026-03-25T15:28:03.742 INFO:tasks.workunit.client.0.vm04.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 
2026-03-25T15:28:42.622 INFO:tasks.workunit.client.0.vm04.stderr:Updating files: 84% (11794/13988) Updating files: 85% (11890/13988) Updating files: 86% (12030/13988) Updating files: 87% (12170/13988) Updating files: 88% (12310/13988) Updating files: 89% (12450/13988) Updating files: 90% (12590/13988) Updating files: 91% (12730/13988) Updating files: 92% (12869/13988) Updating files: 93% (13009/13988) Updating files: 94% (13149/13988) Updating files: 95% (13289/13988) Updating files: 96% (13429/13988) Updating files: 97% (13569/13988) Updating files: 98% (13709/13988) Updating files: 99% (13849/13988) Updating files: 100% (13988/13988) Updating files: 100% (13988/13988), done. 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'. 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:state without impacting any branches by switching back to a branch. 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: git switch -c 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:Or undo this operation with: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: git switch - 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:28:43.379 INFO:tasks.workunit.client.0.vm04.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too 2026-03-25T15:28:43.385 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-25T15:28:43.444 INFO:tasks.workunit.client.0.vm04.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-25T15:28:43.446 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-25T15:28:43.446 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-25T15:28:43.505 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-25T15:28:43.544 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-25T15:28:43.579 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-25T15:28:43.581 
INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-25T15:28:43.581 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-25T15:28:43.614 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-25T15:28:43.618 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-25T15:28:43.618 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-25T15:28:43.676 INFO:tasks.workunit:Running workunits matching rbd/cli_generic.sh on client.0... 2026-03-25T15:28:43.677 INFO:tasks.workunit:Running workunit rbd/cli_generic.sh... 2026-03-25T15:28:43.677 DEBUG:teuthology.orchestra.run.vm04:workunit test rbd/cli_generic.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/cli_generic.sh 2026-03-25T15:28:43.738 INFO:tasks.workunit.client.0.vm04.stderr:+ export RBD_FORCE_ALLOW_V1=1 2026-03-25T15:28:43.738 INFO:tasks.workunit.client.0.vm04.stderr:+ RBD_FORCE_ALLOW_V1=1 2026-03-25T15:28:43.738 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:43.738 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:28:43.738 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v '^0$' 2026-03-25T15:28:43.769 INFO:tasks.workunit.client.0.vm04.stderr:+ IMGS='testimg1 testimg2 testimg3 testimg4 testimg5 testimg6 testimg-diff1 testimg-diff2 
testimg-diff3 foo foo2 bar bar2 test1 test2 test3 test4 clone2' 2026-03-25T15:28:43.769 INFO:tasks.workunit.client.0.vm04.stderr:+ tiered=0 2026-03-25T15:28:43.769 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd dump 2026-03-25T15:28:43.770 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^pool' 2026-03-25T15:28:43.770 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ''\''rbd'\''' 2026-03-25T15:28:43.770 INFO:tasks.workunit.client.0.vm04.stderr:+ grep tier 2026-03-25T15:28:44.057 INFO:tasks.workunit.client.0.vm04.stderr:+ test_pool_image_args 2026-03-25T15:28:44.057 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing pool and image args...' 2026-03-25T15:28:44.057 INFO:tasks.workunit.client.0.vm04.stdout:testing pool and image args... 2026-03-25T15:28:44.057 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:28:44.057 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.264 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.447 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.551 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.653 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.788 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.895 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:44.984 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.122 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.207 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.290 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.381 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.466 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.543 INFO:tasks.workunit.client.0.vm04.stderr:+ for 
img in $IMGS 2026-03-25T15:28:45.655 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.749 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.829 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:45.911 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:46.007 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it 2026-03-25T15:28:46.320 INFO:tasks.workunit.client.0.vm04.stderr:pool 'test' does not exist 2026-03-25T15:28:46.333 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create test 32 2026-03-25T15:28:47.439 INFO:tasks.workunit.client.0.vm04.stderr:pool 'test' already exists 2026-03-25T15:28:47.451 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init test 2026-03-25T15:28:50.303 INFO:tasks.workunit.client.0.vm04.stderr:+ truncate -s 1 /tmp/empty /tmp/empty@snap 2026-03-25T15:28:50.304 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:50.304 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:28:50.304 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-25T15:28:50.329 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:28:50.330 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 1 test1 2026-03-25T15:28:50.356 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:50.354+0000 7f7194b99300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:52.310 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:52.310 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test1 2026-03-25T15:28:52.334 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --image test2 /tmp/empty 2026-03-25T15:28:52.357 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:52.355+0000 7fc8817ac300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:52.366 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 
2026-03-25T15:28:52.367 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest 2026-03-25T15:28:52.370 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:52.370 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test2 2026-03-25T15:28:52.397 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test3 import /tmp/empty 2026-03-25T15:28:52.422 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:52.420+0000 7f1f7764b300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:52.431 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:52.434 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:52.434 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test3 2026-03-25T15:28:52.462 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty foo 2026-03-25T15:28:52.486 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:52.484+0000 7f74f823b300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:52.494 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 
2026-03-25T15:28:52.497 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:52.497 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q foo 2026-03-25T15:28:52.522 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --dest test/empty@snap /tmp/empty 2026-03-25T15:28:52.539 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-25T15:28:52.541 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:28:52.541 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty test/empty@snap 2026-03-25T15:28:52.557 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-25T15:28:52.558 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:28:52.558 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --image test/empty@snap /tmp/empty 2026-03-25T15:28:52.575 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-25T15:28:52.575 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest 2026-03-25T15:28:52.577 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:28:52.577 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty@snap 2026-03-25T15:28:52.595 INFO:tasks.workunit.client.0.vm04.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-25T15:28:52.596 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:28:52.597 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:52.597 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:28:52.597 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0 2026-03-25T15:28:52.624 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:28:52.624 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/empty test/test1 2026-03-25T15:28:52.649 
INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:52.647+0000 7fee2cd9e300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.324 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.327 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.327 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test1 2026-03-25T15:28:54.358 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p test import /tmp/empty test2 2026-03-25T15:28:54.387 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.385+0000 7f0f23176300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.396 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.397 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-25T15:28:54.400 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.400 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test2 2026-03-25T15:28:54.432 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test3 -p test import /tmp/empty 2026-03-25T15:28:54.463 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.460+0000 7f02c20ec300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.471 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 
2026-03-25T15:28:54.472 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest 2026-03-25T15:28:54.472 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-25T15:28:54.475 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.475 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test3 2026-03-25T15:28:54.508 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test4 -p test import /tmp/empty 2026-03-25T15:28:54.536 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.533+0000 7fe6dd812300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.553 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.554 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest 2026-03-25T15:28:54.554 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-25T15:28:54.557 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.557 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test4 2026-03-25T15:28:54.586 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test5 -p test import /tmp/empty 2026-03-25T15:28:54.613 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.611+0000 7fb999d10300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.620 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 
2026-03-25T15:28:54.621 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-25T15:28:54.624 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.624 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test5 2026-03-25T15:28:54.655 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test6 --dest-pool test import /tmp/empty 2026-03-25T15:28:54.682 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.679+0000 7fba12649300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.690 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.693 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.693 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test6 2026-03-25T15:28:54.721 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test7 --dest-pool test import /tmp/empty 2026-03-25T15:28:54.746 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.744+0000 7fba7b030300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.753 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.754 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest 2026-03-25T15:28:54.757 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.757 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test7 2026-03-25T15:28:54.785 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image test/test8 import /tmp/empty 2026-03-25T15:28:54.810 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.808+0000 7fa6500de300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.819 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 
2026-03-25T15:28:54.820 INFO:tasks.workunit.client.0.vm04.stderr:rbd: --image is deprecated, use --dest 2026-03-25T15:28:54.823 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.823 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test8 2026-03-25T15:28:54.850 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --dest test/test9 import /tmp/empty 2026-03-25T15:28:54.878 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.876+0000 7f75be7d7300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.885 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.890 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test9 2026-03-25T15:28:54.918 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --pool test /tmp/empty 2026-03-25T15:28:54.945 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:54.943+0000 7ffbcee35300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:54.974 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done. 2026-03-25T15:28:54.975 INFO:tasks.workunit.client.0.vm04.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-25T15:28:54.979 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:54.979 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q empty 2026-03-25T15:28:55.012 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy test/test9 test10 2026-03-25T15:28:55.062 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 
2026-03-25T15:28:55.066 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:55.066 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qv test10 2026-03-25T15:28:55.101 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:55.101 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test10 2026-03-25T15:28:55.132 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy test/test9 test/test10 2026-03-25T15:28:55.179 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 2026-03-25T15:28:55.183 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:55.183 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test10 2026-03-25T15:28:55.211 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy --pool test test10 --dest-pool test test11 2026-03-25T15:28:55.257 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 2026-03-25T15:28:55.261 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:55.261 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -q test11 2026-03-25T15:28:55.290 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy --dest-pool rbd --pool test test11 test12 2026-03-25T15:28:55.331 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 
2026-03-25T15:28:55.335 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:28:55.335 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test12 2026-03-25T15:28:55.363 INFO:tasks.workunit.client.0.vm04.stdout:test12 2026-03-25T15:28:55.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls test 2026-03-25T15:28:55.363 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qv test12 2026-03-25T15:28:55.395 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/empty /tmp/empty@snap 2026-03-25T15:28:55.395 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it 2026-03-25T15:28:56.386 INFO:tasks.workunit.client.0.vm04.stderr:pool 'test' does not exist 2026-03-25T15:28:56.397 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-25T15:28:56.397 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm foo 2026-03-25T15:28:56.444 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:28:56.447 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-25T15:28:56.447 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:28:56.479 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:28:56.482 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-25T15:28:56.482 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test10 2026-03-25T15:28:56.537 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:28:56.540 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-25T15:28:56.540 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test12 2026-03-25T15:28:56.595 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:28:56.598 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-25T15:28:56.598 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:28:56.631 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:28:56.635 INFO:tasks.workunit.client.0.vm04.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-25T15:28:56.636 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test3 2026-03-25T15:28:56.703 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:28:56.707 INFO:tasks.workunit.client.0.vm04.stderr:+ test_rename 2026-03-25T15:28:56.707 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing rename...' 2026-03-25T15:28:56.707 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:28:56.708 INFO:tasks.workunit.client.0.vm04.stdout:testing rename... 2026-03-25T15:28:56.708 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:56.787 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:56.852 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:56.920 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:56.982 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.060 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.131 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.203 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.286 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.398 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.536 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.599 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.675 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.748 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.813 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:57.931 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:58.067 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:28:58.165 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 foo 2026-03-25T15:28:58.187 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:28:58.195 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:28:58.194+0000 7f41bd832300 -1 librbd: Forced V1 image creation. 2026-03-25T15:28:58.220 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 bar 2026-03-25T15:28:58.262 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename foo foo2 2026-03-25T15:28:58.334 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename foo2 bar 2026-03-25T15:28:58.335 INFO:tasks.workunit.client.0.vm04.stderr:+ grep exists 2026-03-25T15:28:58.391 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25T15:28:58.385+0000 7f248c0dc300 -1 librbd::Operations: rbd image bar already exists 2026-03-25T15:28:58.391 INFO:tasks.workunit.client.0.vm04.stdout:rbd: rename error: (17) File exists 2026-03-25T15:28:58.391 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename bar bar2 2026-03-25T15:28:58.437 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename bar2 foo2 2026-03-25T15:28:58.437 INFO:tasks.workunit.client.0.vm04.stderr:+ grep exists 2026-03-25T15:28:58.473 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25T15:28:58.467+0000 7fda9d48e300 -1 librbd::Operations: rbd image foo2 already exists 2026-03-25T15:28:58.473 INFO:tasks.workunit.client.0.vm04.stdout:rbd: rename error: (17) File exists 2026-03-25T15:28:58.473 
INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:28:59.407 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:28:59.420 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:29:01.682 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -p rbd2 -s 1 foo 2026-03-25T15:29:01.707 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:01.706+0000 7fc4eeb00300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:04.082 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename rbd2/foo rbd2/bar 2026-03-25T15:29:04.431 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls 2026-03-25T15:29:04.431 INFO:tasks.workunit.client.0.vm04.stderr:+ grep bar 2026-03-25T15:29:04.467 INFO:tasks.workunit.client.0.vm04.stdout:bar 2026-03-25T15:29:04.468 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename rbd2/bar foo 2026-03-25T15:29:04.639 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename --pool rbd2 foo bar 2026-03-25T15:29:04.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename rbd2/bar --dest-pool rbd foo 2026-03-25T15:29:04.814 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mv/rename across pools not supported 2026-03-25T15:29:04.814 INFO:tasks.workunit.client.0.vm04.stderr:source pool: rbd2 dest pool: rbd 2026-03-25T15:29:04.815 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rename --pool rbd2 bar --dest-pool rbd2 foo 2026-03-25T15:29:04.872 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls 2026-03-25T15:29:04.873 INFO:tasks.workunit.client.0.vm04.stderr:+ grep foo 2026-03-25T15:29:04.905 INFO:tasks.workunit.client.0.vm04.stdout:foo 2026-03-25T15:29:04.905 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-25T15:29:06.314 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist 2026-03-25T15:29:06.328 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:29:06.328 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:06.694 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:06.785 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:06.860 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:06.941 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.121 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.237 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.339 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.451 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.567 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.637 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.721 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:07.844 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.334 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.431 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.539 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.637 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.757 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.841 INFO:tasks.workunit.client.0.vm04.stderr:+ test_ls 2026-03-25T15:29:08.842 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing ls...' 2026-03-25T15:29:08.842 INFO:tasks.workunit.client.0.vm04.stdout:testing ls... 
2026-03-25T15:29:08.842 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:29:08.842 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:08.932 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.005 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.116 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.228 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.334 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.527 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.591 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.654 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.719 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.787 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.856 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:09.960 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:10.097 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:10.239 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:10.372 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:10.489 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:10.712 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test1 2026-03-25T15:29:10.734 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:29:10.741 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:10.739+0000 7f604b23f300 -1 librbd: Forced V1 image creation. 
2026-03-25T15:29:10.974 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test2 2026-03-25T15:29:10.999 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:29:11.006 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:11.004+0000 7f7c399fc300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:11.058 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:11.058 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-25T15:29:11.091 INFO:tasks.workunit.client.0.vm04.stdout:test1 2026-03-25T15:29:11.091 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:11.091 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2 2026-03-25T15:29:11.117 INFO:tasks.workunit.client.0.vm04.stdout:test2 2026-03-25T15:29:11.117 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:11.117 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:11.117 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2 2026-03-25T15:29:11.146 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-25T15:29:11.147 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:29:11.147 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*1 MiB.*1' 2026-03-25T15:29:11.185 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 1 2026-03-25T15:29:11.186 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:29:11.186 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*1 MiB.*1' 2026-03-25T15:29:11.215 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 1 2026-03-25T15:29:11.215 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:29:11.289 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:11.294 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:11.403 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
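[Editor's note, not part of the captured log: the count checks above use the pattern `rbd ls | wc -l | grep 2`. That final `grep` is a substring match, so it would also accept counts like 12 or 21; it passes here only because the pool really holds 2 images. The cluster-free sketch below demonstrates the difference between the substring match and an anchored whole-line match.]

```shell
# `grep 2` matches any line containing "2" as a substring, so "12" passes:
echo 12 | grep -q 2 && echo "substring: 12 matches 2"

# `grep -x 2` requires the whole line to equal "2", so "12" is rejected:
echo 12 | grep -qx 2 || echo "anchored: 12 does not match 2"
```

A stricter form of the workunit's check would be `rbd ls | wc -l | grep -x 2` (or a shell `[ "$(rbd ls | wc -l)" -eq 2 ]` test), though for the small image counts exercised here the looser match happens to behave the same.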
2026-03-25T15:29:11.411 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-25T15:29:11.535 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-25T15:29:11.619 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:11.619 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-25T15:29:11.650 INFO:tasks.workunit.client.0.vm04.stdout:test1 2026-03-25T15:29:11.650 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:11.650 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2 2026-03-25T15:29:11.687 INFO:tasks.workunit.client.0.vm04.stdout:test2 2026-03-25T15:29:11.687 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:11.688 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:11.688 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2 2026-03-25T15:29:11.715 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-25T15:29:11.715 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:29:11.715 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*1 MiB.*2' 2026-03-25T15:29:11.755 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 2 2026-03-25T15:29:11.755 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:29:11.755 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*1 MiB.*2' 2026-03-25T15:29:11.792 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2 2026-03-25T15:29:11.792 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:29:11.988 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:11.985+0000 7f7d328f9640 0 -- 192.168.123.104:0/4006264663 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f7d0c006e80 msgr2=0x7f7d0c00ae10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:12.192 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:12.196 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:12.482 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:12.479+0000 7f21f9db6640 0 -- 192.168.123.104:0/828985865 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5650070f97e0 msgr2=0x5650071fd8f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:12.482 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:12.486 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-25T15:29:12.567 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test2 2026-03-25T15:29:12.591 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:29:12.597 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:12.595+0000 7f6e4ebce300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:12.618 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:12.618 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-25T15:29:12.647 INFO:tasks.workunit.client.0.vm04.stdout:test1 2026-03-25T15:29:12.647 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:12.647 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2 2026-03-25T15:29:12.675 INFO:tasks.workunit.client.0.vm04.stdout:test2 2026-03-25T15:29:12.676 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:12.676 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2 2026-03-25T15:29:12.676 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:12.706 INFO:tasks.workunit.client.0.vm04.stdout:2 2026-03-25T15:29:12.706 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:29:12.708 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*1 MiB.*2' 2026-03-25T15:29:12.743 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 2 2026-03-25T15:29:12.743 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:29:12.743 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*1 MiB.*1' 2026-03-25T15:29:12.781 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 1 2026-03-25T15:29:12.781 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:29:12.781 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:12.930 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.023 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.260 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.368 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.507 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.618 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.694 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.788 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.895 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:13.991 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:14.070 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:14.162 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:14.474 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:14.591 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:14.718 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:14.816 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:15.148 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99 2026-03-25T15:29:15.151 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 
2026-03-25T15:29:15.151 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.00 -s 1 2026-03-25T15:29:15.183 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.181+0000 7f1d5139e300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.227 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.227 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.01 -s 1 2026-03-25T15:29:15.258 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.255+0000 7f7cde878300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.286 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.286 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.02 -s 1 2026-03-25T15:29:15.314 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.312+0000 7f5466502300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.341 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.03 -s 1 2026-03-25T15:29:15.367 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.365+0000 7f33b23c2300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.382 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.04 -s 1 2026-03-25T15:29:15.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.410+0000 7f286aee5300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.435 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.05 -s 1 2026-03-25T15:29:15.464 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.462+0000 7f5c44174300 -1 librbd: Forced V1 image creation. 
2026-03-25T15:29:15.481 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.481 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.06 -s 1 2026-03-25T15:29:15.508 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.506+0000 7f3471b17300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.519 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.519 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.07 -s 1 2026-03-25T15:29:15.550 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.548+0000 7f866f317300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.617 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.617 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.08 -s 1 2026-03-25T15:29:15.653 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.651+0000 7f031e0e6300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.704 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.704 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.09 -s 1 2026-03-25T15:29:15.735 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.733+0000 7f6b9883f300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.782 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.782 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.10 -s 1 2026-03-25T15:29:15.817 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:15.815+0000 7f6f66f15300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:15.883 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:15.883 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.11 -s 1 2026-03-25T15:29:16.691 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:16.689+0000 7fe36edb3300 -1 librbd: Forced V1 image creation. 
2026-03-25T15:29:16.738 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:16.739 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.12 -s 1
2026-03-25T15:29:16.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:16.767+0000 7f59d86ec300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:16.807 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:16.807 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.13 -s 1
2026-03-25T15:29:16.842 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:16.839+0000 7ff0a73c6300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:16.967 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:16.967 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.14 -s 1
2026-03-25T15:29:16.993 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:16.991+0000 7f7a8db4a300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.024 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.024 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.15 -s 1
2026-03-25T15:29:17.056 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.054+0000 7f5b6fb41300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.085 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.086 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.16 -s 1
2026-03-25T15:29:17.121 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.118+0000 7fcec6e5f300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.171 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.171 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.17 -s 1
2026-03-25T15:29:17.202 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.200+0000 7fe08c030300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.211 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.211 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.18 -s 1
2026-03-25T15:29:17.242 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.240+0000 7f172e2e2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.304 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.304 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.19 -s 1
2026-03-25T15:29:17.339 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.337+0000 7fa06acc1300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.383 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.383 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.20 -s 1
2026-03-25T15:29:17.412 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.410+0000 7f7efcb2e300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.422 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.422 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.21 -s 1
2026-03-25T15:29:17.453 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.451+0000 7f4dab0a4300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.563 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.563 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.22 -s 1
2026-03-25T15:29:17.593 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.591+0000 7fe589102300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.632 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.632 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.23 -s 1
2026-03-25T15:29:17.666 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.663+0000 7f705d8e2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.728 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.728 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.24 -s 1
2026-03-25T15:29:17.758 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.756+0000 7fadaf70a300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.804 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.804 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.25 -s 1
2026-03-25T15:29:17.837 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.834+0000 7feafef2e300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.864 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.864 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.26 -s 1
2026-03-25T15:29:17.898 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:17.896+0000 7f8c45c24300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:17.968 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:17.969 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.27 -s 1
2026-03-25T15:29:18.016 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.014+0000 7fe06eddb300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.046 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.046 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.28 -s 1
2026-03-25T15:29:18.085 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.083+0000 7f9cd27a2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.146 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.146 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.29 -s 1
2026-03-25T15:29:18.180 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.178+0000 7f9d13342300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.239 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.239 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.30 -s 1
2026-03-25T15:29:18.271 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.269+0000 7fe440f26300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.303 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.303 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.31 -s 1
2026-03-25T15:29:18.336 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.334+0000 7f8603115300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.393 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.393 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.32 -s 1
2026-03-25T15:29:18.425 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.422+0000 7ff69897f300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.502 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.503 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.33 -s 1
2026-03-25T15:29:18.532 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.530+0000 7ff91a64b300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.563 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.564 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.34 -s 1
2026-03-25T15:29:18.589 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.587+0000 7f900ec22300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.597 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.597 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.35 -s 1
2026-03-25T15:29:18.624 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.622+0000 7fb7545d7300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.631 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.631 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.36 -s 1
2026-03-25T15:29:18.660 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.658+0000 7f5781dc6300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.667 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.667 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.37 -s 1
2026-03-25T15:29:18.693 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.691+0000 7f5a548c1300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.699 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.699 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.38 -s 1
2026-03-25T15:29:18.723 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.722+0000 7fba083d9300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.729 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.39 -s 1
2026-03-25T15:29:18.755 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.753+0000 7f62d0f69300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.763 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.763 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.40 -s 1
2026-03-25T15:29:18.795 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.793+0000 7f0ae55d9300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.804 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.804 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.41 -s 1
2026-03-25T15:29:18.832 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.830+0000 7fdfce4cf300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.840 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.840 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.42 -s 1
2026-03-25T15:29:18.867 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.866+0000 7fee88faf300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.876 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.876 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.43 -s 1
2026-03-25T15:29:18.905 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.903+0000 7f8e218cf300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.912 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.912 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.44 -s 1
2026-03-25T15:29:18.940 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.938+0000 7f2dd634c300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.948 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.948 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.45 -s 1
2026-03-25T15:29:18.977 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:18.974+0000 7f75160c1300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:18.989 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:18.989 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.46 -s 1
2026-03-25T15:29:19.030 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.028+0000 7f56e7c57300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.087 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.47 -s 1
2026-03-25T15:29:19.116 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.113+0000 7f214419c300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.135 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.135 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.48 -s 1
2026-03-25T15:29:19.169 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.167+0000 7fd3d7ae6300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.229 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.229 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.49 -s 1
2026-03-25T15:29:19.262 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.260+0000 7f0f52399300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.315 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.315 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.50 -s 1
2026-03-25T15:29:19.345 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.343+0000 7f6d4f064300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.382 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.51 -s 1
2026-03-25T15:29:19.416 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.414+0000 7f54ce1dd300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.481 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.481 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.52 -s 1
2026-03-25T15:29:19.517 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.515+0000 7fa57bbfa300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.539 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.539 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.53 -s 1
2026-03-25T15:29:19.570 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.568+0000 7fc58979e300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.596 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.597 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.54 -s 1
2026-03-25T15:29:19.626 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.624+0000 7fd3c517c300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.670 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.670 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.55 -s 1
2026-03-25T15:29:19.701 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.698+0000 7fd7d2e43300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.756 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.757 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.56 -s 1
2026-03-25T15:29:19.783 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.781+0000 7fb5529f4300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.792 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.792 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.57 -s 1
2026-03-25T15:29:19.820 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.818+0000 7fcf9eb26300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.838 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.838 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.58 -s 1
2026-03-25T15:29:19.874 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.872+0000 7fb1756e8300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.922 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.923 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.59 -s 1
2026-03-25T15:29:19.953 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:19.951+0000 7fd74da51300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:19.970 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:19.971 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.60 -s 1
2026-03-25T15:29:20.200 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.198+0000 7f6fb15eb300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.226 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.226 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.61 -s 1
2026-03-25T15:29:20.255 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.253+0000 7f0297d4a300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.306 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.306 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.62 -s 1
2026-03-25T15:29:20.334 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.332+0000 7fa8f2fb3300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.370 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.370 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.63 -s 1
2026-03-25T15:29:20.396 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.394+0000 7f87a48e0300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.404 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.64 -s 1
2026-03-25T15:29:20.436 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.434+0000 7f1f7f55a300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.478 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.478 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.65 -s 1
2026-03-25T15:29:20.508 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.506+0000 7f4981c45300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.542 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.542 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.66 -s 1
2026-03-25T15:29:20.573 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.571+0000 7f8dbd376300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.611 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.611 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.67 -s 1
2026-03-25T15:29:20.641 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.639+0000 7f597e908300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.670 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.670 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.68 -s 1
2026-03-25T15:29:20.697 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.695+0000 7f0436a8e300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.747 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.747 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.69 -s 1
2026-03-25T15:29:20.775 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.773+0000 7f1f205d2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.809 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.809 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.70 -s 1
2026-03-25T15:29:20.838 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.836+0000 7f62092c3300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.878 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.879 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.71 -s 1
2026-03-25T15:29:20.909 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.907+0000 7f4fe4187300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.938 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.939 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.72 -s 1
2026-03-25T15:29:20.969 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:20.967+0000 7f8743756300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:20.997 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:20.997 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.73 -s 1
2026-03-25T15:29:21.028 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.026+0000 7fc2905a4300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.068 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.069 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.74 -s 1
2026-03-25T15:29:21.100 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.098+0000 7f32291f4300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.135 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.135 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.75 -s 1
2026-03-25T15:29:21.163 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.161+0000 7fee88d8f300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.203 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.204 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.76 -s 1
2026-03-25T15:29:21.235 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.233+0000 7f8fea7f2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.252 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.252 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.77 -s 1
2026-03-25T15:29:21.280 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.278+0000 7fb1ed23b300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.307 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.307 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.78 -s 1
2026-03-25T15:29:21.335 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.333+0000 7f8a32764300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.361 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.361 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.79 -s 1
2026-03-25T15:29:21.388 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.386+0000 7f70e8528300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.401 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.401 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.80 -s 1
2026-03-25T15:29:21.437 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.435+0000 7f33974dc300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.483 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.483 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.81 -s 1
2026-03-25T15:29:21.513 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.511+0000 7ff6306c5300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.562 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.562 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.82 -s 1
2026-03-25T15:29:21.593 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.591+0000 7f3d11700300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.640 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.640 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.83 -s 1
2026-03-25T15:29:21.669 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.667+0000 7f617c861300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.697 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.697 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.84 -s 1
2026-03-25T15:29:21.726 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.724+0000 7f2f4e6a2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.736 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.736 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.85 -s 1
2026-03-25T15:29:21.769 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.767+0000 7f697e94c300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.796 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.86 -s 1
2026-03-25T15:29:21.828 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.826+0000 7fb2125d2300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.853 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.853 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.87 -s 1
2026-03-25T15:29:21.883 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.881+0000 7fe215f8b300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.907 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.907 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.88 -s 1
2026-03-25T15:29:21.942 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.940+0000 7fd0afeec300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:21.959 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:21.959 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.89 -s 1
2026-03-25T15:29:21.992 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:21.989+0000 7f7674543300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.006 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.007 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.90 -s 1
2026-03-25T15:29:22.038 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.036+0000 7ff883daf300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.068 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.068 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.91 -s 1
2026-03-25T15:29:22.101 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.099+0000 7ff9e5dcc300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.141 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.142 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.92 -s 1
2026-03-25T15:29:22.370 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.368+0000 7fe78efdd640 0 --2- 192.168.123.104:0/3405581027 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x55e2befd4fb0 0x55e2befc7120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-25T15:29:22.373 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.372+0000 7fe790528300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.385 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.385 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.93 -s 1
2026-03-25T15:29:22.414 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.412+0000 7fb31d6ec300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.423 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.423 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.94 -s 1
2026-03-25T15:29:22.456 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.455+0000 7fc354717300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.476 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.476 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.95 -s 1
2026-03-25T15:29:22.506 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.504+0000 7f00d071f300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.515 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.515 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.96 -s 1
2026-03-25T15:29:22.544 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.542+0000 7fc68becb300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.560 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.561 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.97 -s 1
2026-03-25T15:29:22.592 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.590+0000 7f68a4c30300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.623 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.623 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.98 -s 1
2026-03-25T15:29:22.656 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.654+0000 7f4d5b6c3300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.685 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:22.685 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.99 -s 1
2026-03-25T15:29:22.715 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.713+0000 7f214c94a300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:22.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:29:22.775 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:29:22.775 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100
2026-03-25T15:29:22.806 INFO:tasks.workunit.client.0.vm04.stdout:100
2026-03-25T15:29:22.806 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-25T15:29:22.806 INFO:tasks.workunit.client.0.vm04.stderr:+ grep image
2026-03-25T15:29:22.807 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:29:22.807 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100
2026-03-25T15:29:22.874 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.869+0000 7f6e507ce640 0 -- 192.168.123.104:0/415053373 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55a254563cf0 msgr2=0x55a25457b4c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:22.883 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:22.881+0000 7f6e507ce640 0 -- 192.168.123.104:0/415053373 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f6e3005ef70 msgr2=0x7f6e3007f350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:23.016 INFO:tasks.workunit.client.0.vm04.stdout:100
2026-03-25T15:29:23.016 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99
2026-03-25T15:29:23.017 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.017 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.00
2026-03-25T15:29:23.060 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.064 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.064 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.01
2026-03-25T15:29:23.109 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.112 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.112 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.02
2026-03-25T15:29:23.196 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.203 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.203 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.03
2026-03-25T15:29:23.277 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.282 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.282 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.04
2026-03-25T15:29:23.324 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.330 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.330 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.05
2026-03-25T15:29:23.398 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.405 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.06
2026-03-25T15:29:23.468 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.473 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.473 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.07
2026-03-25T15:29:23.563 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.572 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:23.572 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.08
2026-03-25T15:29:23.635 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:23.641 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:23.641 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.09 2026-03-25T15:29:23.716 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:23.721 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:23.721 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.10 2026-03-25T15:29:23.787 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:23.792 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:23.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.11 2026-03-25T15:29:23.856 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:23.862 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:23.862 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.12 2026-03-25T15:29:23.904 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:23.908 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:23.908 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.13 2026-03-25T15:29:24.013 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.018 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.018 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.14 2026-03-25T15:29:24.069 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.074 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.074 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.15 2026-03-25T15:29:24.126 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:24.131 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.131 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.16 2026-03-25T15:29:24.178 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.182 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.182 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.17 2026-03-25T15:29:24.244 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.248 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.248 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.18 2026-03-25T15:29:24.310 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.315 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.315 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.19 2026-03-25T15:29:24.372 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.378 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.378 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.20 2026-03-25T15:29:24.437 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.442 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.442 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.21 2026-03-25T15:29:24.499 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.504 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.504 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.22 2026-03-25T15:29:24.554 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:24.559 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.559 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.23 2026-03-25T15:29:24.624 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.628 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.628 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.24 2026-03-25T15:29:24.674 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.680 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.680 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.25 2026-03-25T15:29:24.728 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.732 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.732 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.26 2026-03-25T15:29:24.789 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.793 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.27 2026-03-25T15:29:24.834 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.838 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.838 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.28 2026-03-25T15:29:24.919 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:24.924 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:24.924 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.29 2026-03-25T15:29:25.028 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:25.034 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.034 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.30 2026-03-25T15:29:25.086 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.090 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.090 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.31 2026-03-25T15:29:25.139 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.144 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.144 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.32 2026-03-25T15:29:25.191 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.194 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.194 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.33 2026-03-25T15:29:25.258 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.260 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.261 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.34 2026-03-25T15:29:25.332 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.336 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.337 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.35 2026-03-25T15:29:25.398 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.403 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.403 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.36 2026-03-25T15:29:25.459 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:25.463 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.463 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.37 2026-03-25T15:29:25.516 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.521 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.521 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.38 2026-03-25T15:29:25.596 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.600 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.600 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.39 2026-03-25T15:29:25.635 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.638 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.638 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.40 2026-03-25T15:29:25.673 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.676 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.676 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.41 2026-03-25T15:29:25.708 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.711 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.711 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.42 2026-03-25T15:29:25.743 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.747 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.747 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.43 2026-03-25T15:29:25.779 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:25.782 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.782 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.44 2026-03-25T15:29:25.815 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.818 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.818 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.45 2026-03-25T15:29:25.850 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.854 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.854 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.46 2026-03-25T15:29:25.921 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.926 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.926 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.47 2026-03-25T15:29:25.960 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:25.964 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:25.964 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.48 2026-03-25T15:29:26.002 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.005 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.005 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.49 2026-03-25T15:29:26.049 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.053 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.053 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.50 2026-03-25T15:29:26.101 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:26.105 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.105 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.51 2026-03-25T15:29:26.150 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.155 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.155 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.52 2026-03-25T15:29:26.204 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.208 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.208 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.53 2026-03-25T15:29:26.262 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.267 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.267 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.54 2026-03-25T15:29:26.307 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.313 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.313 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.55 2026-03-25T15:29:26.364 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.368 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.368 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.56 2026-03-25T15:29:26.457 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.465 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.465 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.57 2026-03-25T15:29:26.518 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:26.522 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.522 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.58 2026-03-25T15:29:26.587 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.592 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.596 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.59 2026-03-25T15:29:26.655 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.660 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.660 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.60 2026-03-25T15:29:26.734 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.737 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.738 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.61 2026-03-25T15:29:26.789 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.793 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.62 2026-03-25T15:29:26.843 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.847 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.847 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.63 2026-03-25T15:29:26.888 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:26.892 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.892 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.64 2026-03-25T15:29:26.974 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:26.979 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:26.980 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.65 2026-03-25T15:29:27.059 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.066 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.066 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.66 2026-03-25T15:29:27.127 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.131 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.132 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.67 2026-03-25T15:29:27.193 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.199 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.199 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.68 2026-03-25T15:29:27.255 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.260 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.260 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.69 2026-03-25T15:29:27.323 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.329 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.329 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.70 2026-03-25T15:29:27.407 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.412 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.412 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.71 2026-03-25T15:29:27.481 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:27.485 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.485 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.72 2026-03-25T15:29:27.724 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.729 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.73 2026-03-25T15:29:27.771 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.776 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.780 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.74 2026-03-25T15:29:27.831 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.836 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.75 2026-03-25T15:29:27.901 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.905 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.905 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.76 2026-03-25T15:29:27.966 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:27.970 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:27.970 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.77 2026-03-25T15:29:28.033 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.036 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.036 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.78 2026-03-25T15:29:28.074 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:28.078 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.078 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.79 2026-03-25T15:29:28.143 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.148 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.148 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.80 2026-03-25T15:29:28.192 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.198 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.198 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.81 2026-03-25T15:29:28.250 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.254 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.255 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.82 2026-03-25T15:29:28.319 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.323 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.323 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.83 2026-03-25T15:29:28.366 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.84 2026-03-25T15:29:28.424 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.429 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.85 2026-03-25T15:29:28.466 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:28.469 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.470 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.86 2026-03-25T15:29:28.530 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.535 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.535 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.87 2026-03-25T15:29:28.576 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.580 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.580 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.88 2026-03-25T15:29:28.638 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.642 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.643 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.89 2026-03-25T15:29:28.838 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.842 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.90 2026-03-25T15:29:28.968 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:28.974 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:28.974 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.91 2026-03-25T15:29:29.304 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.310 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.311 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.92 2026-03-25T15:29:29.376 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:29.380 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.380 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.93 2026-03-25T15:29:29.424 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.428 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.428 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.94 2026-03-25T15:29:29.482 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.486 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.486 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.95 2026-03-25T15:29:29.553 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.557 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.557 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.96 2026-03-25T15:29:29.601 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.606 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.606 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.97 2026-03-25T15:29:29.649 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.653 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.653 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.98 2026-03-25T15:29:29.720 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:29.725 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.725 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.99 2026-03-25T15:29:29.802 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:29.807 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99 2026-03-25T15:29:29.808 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.808 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.00 --image-format 2 -s 1 2026-03-25T15:29:29.861 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.861 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.01 --image-format 2 -s 1 2026-03-25T15:29:29.929 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.929 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.02 --image-format 2 -s 1 2026-03-25T15:29:29.997 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:29.997 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.03 --image-format 2 -s 1 2026-03-25T15:29:30.062 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.063 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.04 --image-format 2 -s 1 2026-03-25T15:29:30.141 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.141 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.05 --image-format 2 -s 1 2026-03-25T15:29:30.187 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.187 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.06 --image-format 2 -s 1 2026-03-25T15:29:30.260 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.260 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.07 --image-format 2 -s 1 2026-03-25T15:29:30.570 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.570 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.08 --image-format 2 -s 1 2026-03-25T15:29:30.645 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 
2026-03-25T15:29:30.646 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.09 --image-format 2 -s 1 2026-03-25T15:29:30.869 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.869 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.10 --image-format 2 -s 1 2026-03-25T15:29:30.913 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.11 --image-format 2 -s 1 2026-03-25T15:29:30.996 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:30.997 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.12 --image-format 2 -s 1 2026-03-25T15:29:31.073 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:31.073 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.13 --image-format 2 -s 1 2026-03-25T15:29:31.261 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:31.261 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.14 --image-format 2 -s 1 2026-03-25T15:29:31.557 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:31.557 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.15 --image-format 2 -s 1 2026-03-25T15:29:31.622 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:31.623 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.16 --image-format 2 -s 1 2026-03-25T15:29:31.756 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:31.756 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.17 --image-format 2 -s 1 2026-03-25T15:29:31.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:31.836 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.18 --image-format 2 -s 1 2026-03-25T15:29:31.965 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq 
-w 00 99)
2026-03-25T15:29:31.965 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.19 --image-format 2 -s 1
2026-03-25T15:29:32.034 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.034 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.20 --image-format 2 -s 1
2026-03-25T15:29:32.119 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.119 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.21 --image-format 2 -s 1
2026-03-25T15:29:32.201 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.201 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.22 --image-format 2 -s 1
2026-03-25T15:29:32.242 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.23 --image-format 2 -s 1
2026-03-25T15:29:32.322 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.322 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.24 --image-format 2 -s 1
2026-03-25T15:29:32.401 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.401 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.25 --image-format 2 -s 1
2026-03-25T15:29:32.463 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.463 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.26 --image-format 2 -s 1
2026-03-25T15:29:32.509 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.509 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.27 --image-format 2 -s 1
2026-03-25T15:29:32.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.574 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.28 --image-format 2 -s 1
2026-03-25T15:29:32.669 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.669 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.29 --image-format 2 -s 1
2026-03-25T15:29:32.741 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.741 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.30 --image-format 2 -s 1
2026-03-25T15:29:32.810 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.810 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.31 --image-format 2 -s 1
2026-03-25T15:29:32.889 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.32 --image-format 2 -s 1
2026-03-25T15:29:32.991 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:32.991 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.33 --image-format 2 -s 1
2026-03-25T15:29:33.084 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.084 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.34 --image-format 2 -s 1
2026-03-25T15:29:33.163 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.35 --image-format 2 -s 1
2026-03-25T15:29:33.256 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.256 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.36 --image-format 2 -s 1
2026-03-25T15:29:33.294 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.294 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.37 --image-format 2 -s 1
2026-03-25T15:29:33.534 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.534 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.38 --image-format 2 -s 1
2026-03-25T15:29:33.576 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.576 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.39 --image-format 2 -s 1
2026-03-25T15:29:33.618 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.618 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.40 --image-format 2 -s 1
2026-03-25T15:29:33.683 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.683 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.41 --image-format 2 -s 1
2026-03-25T15:29:33.724 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.724 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.42 --image-format 2 -s 1
2026-03-25T15:29:33.761 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.761 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.43 --image-format 2 -s 1
2026-03-25T15:29:33.802 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.802 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.44 --image-format 2 -s 1
2026-03-25T15:29:33.846 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.846 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.45 --image-format 2 -s 1
2026-03-25T15:29:33.887 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.887 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.46 --image-format 2 -s 1
2026-03-25T15:29:33.929 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.929 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.47 --image-format 2 -s 1
2026-03-25T15:29:33.970 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:33.971 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.48 --image-format 2 -s 1
2026-03-25T15:29:34.011 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.011 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.49 --image-format 2 -s 1
2026-03-25T15:29:34.048 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.048 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.50 --image-format 2 -s 1
2026-03-25T15:29:34.093 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.093 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.51 --image-format 2 -s 1
2026-03-25T15:29:34.133 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.134 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.52 --image-format 2 -s 1
2026-03-25T15:29:34.175 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.175 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.53 --image-format 2 -s 1
2026-03-25T15:29:34.213 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.54 --image-format 2 -s 1
2026-03-25T15:29:34.249 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.249 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.55 --image-format 2 -s 1
2026-03-25T15:29:34.289 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.289 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.56 --image-format 2 -s 1
2026-03-25T15:29:34.328 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.328 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.57 --image-format 2 -s 1
2026-03-25T15:29:34.368 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.368 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.58 --image-format 2 -s 1
2026-03-25T15:29:34.406 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.406 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.59 --image-format 2 -s 1
2026-03-25T15:29:34.451 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.451 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.60 --image-format 2 -s 1
2026-03-25T15:29:34.493 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.493 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.61 --image-format 2 -s 1
2026-03-25T15:29:34.531 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.531 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.62 --image-format 2 -s 1
2026-03-25T15:29:34.570 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.570 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.63 --image-format 2 -s 1
2026-03-25T15:29:34.609 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.609 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.64 --image-format 2 -s 1
2026-03-25T15:29:34.650 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.650 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.65 --image-format 2 -s 1
2026-03-25T15:29:34.689 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.689 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.66 --image-format 2 -s 1
2026-03-25T15:29:34.726 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.726 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.67 --image-format 2 -s 1
2026-03-25T15:29:34.761 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.761 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.68 --image-format 2 -s 1
2026-03-25T15:29:34.800 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.800 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.69 --image-format 2 -s 1
2026-03-25T15:29:34.842 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.70 --image-format 2 -s 1
2026-03-25T15:29:34.884 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.884 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.71 --image-format 2 -s 1
2026-03-25T15:29:34.926 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.926 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.72 --image-format 2 -s 1
2026-03-25T15:29:34.962 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.962 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.73 --image-format 2 -s 1
2026-03-25T15:29:34.996 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:34.996 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.74 --image-format 2 -s 1
2026-03-25T15:29:35.031 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.031 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.75 --image-format 2 -s 1
2026-03-25T15:29:35.065 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.066 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.76 --image-format 2 -s 1
2026-03-25T15:29:35.104 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.104 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.77 --image-format 2 -s 1
2026-03-25T15:29:35.143 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.143 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.78 --image-format 2 -s 1
2026-03-25T15:29:35.182 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.182 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.79 --image-format 2 -s 1
2026-03-25T15:29:35.216 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.216 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.80 --image-format 2 -s 1
2026-03-25T15:29:35.250 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.250 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.81 --image-format 2 -s 1
2026-03-25T15:29:35.285 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.285 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.82 --image-format 2 -s 1
2026-03-25T15:29:35.318 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.319 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.83 --image-format 2 -s 1
2026-03-25T15:29:35.352 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.352 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.84 --image-format 2 -s 1
2026-03-25T15:29:35.385 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.385 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.85 --image-format 2 -s 1
2026-03-25T15:29:35.420 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.420 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.86 --image-format 2 -s 1
2026-03-25T15:29:35.459 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.459 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.87 --image-format 2 -s 1
2026-03-25T15:29:35.496 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.496 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.88 --image-format 2 -s 1
2026-03-25T15:29:35.533 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.533 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.89 --image-format 2 -s 1
2026-03-25T15:29:35.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.574 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.90 --image-format 2 -s 1
2026-03-25T15:29:35.610 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.610 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.91 --image-format 2 -s 1
2026-03-25T15:29:35.643 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.643 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.92 --image-format 2 -s 1
2026-03-25T15:29:35.678 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.679 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.93 --image-format 2 -s 1
2026-03-25T15:29:35.720 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.720 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.94 --image-format 2 -s 1
2026-03-25T15:29:35.758 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.759 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.95 --image-format 2 -s 1
2026-03-25T15:29:35.795 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.795 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.96 --image-format 2 -s 1
2026-03-25T15:29:35.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.836 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.97 --image-format 2 -s 1
2026-03-25T15:29:35.876 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.876 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.98 --image-format 2 -s 1
2026-03-25T15:29:35.993 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:35.993 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create image.99 --image-format 2 -s 1
2026-03-25T15:29:36.096 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:29:36.097 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:29:36.097 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100
2026-03-25T15:29:36.125 INFO:tasks.workunit.client.0.vm04.stdout:100
2026-03-25T15:29:36.126 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-25T15:29:36.126 INFO:tasks.workunit.client.0.vm04.stderr:+ grep image
2026-03-25T15:29:36.126 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:29:36.126 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 100
2026-03-25T15:29:36.168 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:36.167+0000 7fc535f10640 0 -- 192.168.123.104:0/3229799951 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5639921bd790 msgr2=0x5639922561a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:36.171 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:36.170+0000 7fc535f10640 0 -- 192.168.123.104:0/3229799951 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fc51405ee50 msgr2=0x7fc51407f230 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:36.340 INFO:tasks.workunit.client.0.vm04.stdout:100
2026-03-25T15:29:36.340 INFO:tasks.workunit.client.0.vm04.stderr:++ seq -w 00 99
2026-03-25T15:29:36.341 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:36.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.00
2026-03-25T15:29:36.415 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:36.419 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:36.419 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.01
2026-03-25T15:29:36.518 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:36.516+0000 7fc2bb028640 0 -- 192.168.123.104:0/3930796479 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558470656cc0 msgr2=0x558470636da0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:36.520 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:36.524 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:36.524 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.02
2026-03-25T15:29:36.594 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:36.592+0000 7f54ed5b6640 0 -- 192.168.123.104:0/186530020 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f54cc05ee50 msgr2=0x7f54cc07f230 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:36.603 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:36.607 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:36.607 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.03
2026-03-25T15:29:36.742 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:36.740+0000 7f1c9f535640 0 -- 192.168.123.104:0/1753432238 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f1c7800dd20 msgr2=0x7f1c78004cb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:36.948 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:36.954 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:36.954 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.04
2026-03-25T15:29:37.031 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.029+0000 7f318cf32640 0 -- 192.168.123.104:0/1305279654 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55866a3dfcc0 msgr2=0x55866a4316d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.038 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.041 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.041 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.05
2026-03-25T15:29:37.107 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.105+0000 7ff34d156640 0 -- 192.168.123.104:0/35242847 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7ff32c05ef70 msgr2=0x7ff32c07f350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.116 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.121 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.121 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.06
2026-03-25T15:29:37.201 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.199+0000 7f4c3e9e4640 0 -- 192.168.123.104:0/2115359040 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e473e109b0 msgr2=0x55e473f143d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.206 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.210 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.210 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.07
2026-03-25T15:29:37.276 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.275+0000 7f757b96c640 0 -- 192.168.123.104:0/3744732813 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f755c05ef70 msgr2=0x7f755c07f350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.283 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.287 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.287 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.08
2026-03-25T15:29:37.366 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.369 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.369 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.09
2026-03-25T15:29:37.490 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.488+0000 7f55ef20d640 0 -- 192.168.123.104:0/3989196980 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5596a6351cc0 msgr2=0x5596a63a36d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.491 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.494 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.494 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.10
2026-03-25T15:29:37.572 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.570+0000 7fb0a9cf8640 0 -- 192.168.123.104:0/112580105 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x562571251cc0 msgr2=0x56257122d830 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.572 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.575 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.575 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.11
2026-03-25T15:29:37.698 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.703 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.703 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.12
2026-03-25T15:29:37.832 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.830+0000 7f4c2b69a640 0 -- 192.168.123.104:0/1584379328 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558ca5982030 msgr2=0x558ca5972370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.832 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.837 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.837 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.13
2026-03-25T15:29:37.926 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:37.924+0000 7f201f938640 0 -- 192.168.123.104:0/2809382797 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x556405ded030 msgr2=0x556405ddd370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:37.928 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:37.932 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:37.932 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.14
2026-03-25T15:29:38.093 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.091+0000 7f2468d8a640 0 -- 192.168.123.104:0/800142735 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f24440011e0 msgr2=0x7f24440271f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.095 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.099 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.15
2026-03-25T15:29:38.210 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.208+0000 7efd84741640 0 -- 192.168.123.104:0/3500274805 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x555d36d3f030 msgr2=0x555d36d2f370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.210 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.214 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.214 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.16
2026-03-25T15:29:38.301 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.299+0000 7f010e8d7640 0 -- 192.168.123.104:0/1784537139 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x555ca90777e0 msgr2=0x555ca917b880 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.302 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.305 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.305 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.17
2026-03-25T15:29:38.417 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.415+0000 7fc60e688640 0 -- 192.168.123.104:0/3996159715 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55da929c1030 msgr2=0x55da929b1370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.417 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.420 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.420 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.18
2026-03-25T15:29:38.632 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.630+0000 7f25b2349640 0 -- 192.168.123.104:0/201824909 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x55f170b03690 msgr2=0x55f170b23b10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.637 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.641 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.641 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.19
2026-03-25T15:29:38.731 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.735 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.735 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.20
2026-03-25T15:29:38.825 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.823+0000 7f1dd7621640 0 -- 192.168.123.104:0/1609151129 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55fd1b1a4030 msgr2=0x55fd1b194820 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.826 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.829 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.21
2026-03-25T15:29:38.903 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.901+0000 7f0193aef640 0 -- 192.168.123.104:0/3275827842 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55b60408fc30 msgr2=0x55b6040e6790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.903 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.907 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.907 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.22
2026-03-25T15:29:38.984 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:38.982+0000 7f93d1e48640 0 -- 192.168.123.104:0/3812900108 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55db3acf5030 msgr2=0x55db3ace5370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:38.987 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:38.991 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:38.992 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.23
2026-03-25T15:29:39.147 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:39.144+0000 7f0068694640 0 -- 192.168.123.104:0/1866378381 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5627dd1fd690 msgr2=0x5627dd316960 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:39.150 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.154 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.154 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.24
2026-03-25T15:29:39.529 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:39.527+0000 7f51a96e0640 0 -- 192.168.123.104:0/1760560838 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564b161917e0 msgr2=0x564b162953d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:39.541 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.545 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.545 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.25
2026-03-25T15:29:39.627 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:39.625+0000 7efde1fdc640 0 -- 192.168.123.104:0/815188830 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x556a20a8fcc0 msgr2=0x556a20a6b3d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:39.629 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.633 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.633 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.26
2026-03-25T15:29:39.712 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:39.710+0000 7fecf3fff640 0 -- 192.168.123.104:0/422595776 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fecd0012f20 msgr2=0x7fecd0013390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:39.714 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.718 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.718 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.27
2026-03-25T15:29:39.789 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.793 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.28
2026-03-25T15:29:39.872 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:39.870+0000 7ff87692b640 0 -- 192.168.123.104:0/1000776106 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55c512300c30 msgr2=0x55c5123575a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:39.874 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.878 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.878 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.29
2026-03-25T15:29:39.957 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:39.960 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:39.960 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.30
2026-03-25T15:29:40.063 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.061+0000 7f7d1445d640 0 -- 192.168.123.104:0/12292836 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x561671c297e0 msgr2=0x561671d2d3d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.064 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.068 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.069 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.31
2026-03-25T15:29:40.142 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.145 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.145 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.32
2026-03-25T15:29:40.216 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.215+0000 7fa30e5e0640 0 -- 192.168.123.104:0/1216746446 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55875f89ec30 msgr2=0x55875f8f5580 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.221 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.225 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.225 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.33
2026-03-25T15:29:40.289 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.288+0000 7f8ddc58f640 0 -- 192.168.123.104:0/187645610 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55b07ae98cc0 msgr2=0x55b07ae74550 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:40.292 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.295 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.295 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.34
2026-03-25T15:29:40.368 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.366+0000 7fdf79db0640 0 -- 192.168.123.104:0/1527459561 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55cebbb17030 msgr2=0x55cebbb07370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.368 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.35
2026-03-25T15:29:40.434 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.432+0000 7fe1d1ed4640 0 -- 192.168.123.104:0/863420286 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fe1b4008d20 msgr2=0x7fe1b40291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.440 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.444 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.444 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.36
2026-03-25T15:29:40.508 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.506+0000 7f388ec57640 0 -- 192.168.123.104:0/3544287878 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e878344030 msgr2=0x55e878334200 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:40.510 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.514 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.514 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.37
2026-03-25T15:29:40.590 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.588+0000 7f8fd3145640 0 -- 192.168.123.104:0/3257641929 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x560aab2a5030 msgr2=0x560aab299a30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.591 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.595 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.595 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.38
2026-03-25T15:29:40.674 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.678 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.678 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.39
2026-03-25T15:29:40.760 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.764 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.40
2026-03-25T15:29:40.843 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.841+0000 7fc72b7fe640 0 -- 192.168.123.104:0/2061684491 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fc71005ef70 msgr2=0x7fc71007f350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:40.850 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.854 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.854 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.41
2026-03-25T15:29:40.921 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.919+0000 7f8bfcd30640 0 -- 192.168.123.104:0/1161571630 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f8bdc05ef70 msgr2=0x7f8bdc07f350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.926 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:40.930 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:40.930 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.42
2026-03-25T15:29:40.996 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:40.994+0000 7f9222291640 0 -- 192.168.123.104:0/2168902739 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f91fc012f10 msgr2=0x7f91fc013380 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:40.998 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.002 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.002 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.43
2026-03-25T15:29:41.063 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.061+0000 7f547277e640 0 -- 192.168.123.104:0/274807497 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f545005ee30 msgr2=0x7f545007f210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:41.067 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.071 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.071 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.44
2026-03-25T15:29:41.134 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.137 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.137 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.45
2026-03-25T15:29:41.204 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.203+0000 7f4b66900640 0 -- 192.168.123.104:0/3469270035 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x5602760fd360 msgr2=0x56027611d7e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:41.207 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.210 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.210 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.46
2026-03-25T15:29:41.284 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.282+0000 7f35b472e640 0 -- 192.168.123.104:0/2952759096 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f359405eea0 msgr2=0x7f359407f280 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:41.287 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.290 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.290 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.47
2026-03-25T15:29:41.359 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.357+0000 7fb722df2640 0 -- 192.168.123.104:0/1636549657 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x560c89591c30 msgr2=0x560c895e8790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:41.359 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.363 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.48
2026-03-25T15:29:41.425 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.423+0000 7f1cff216640 0 -- 192.168.123.104:0/2718594004 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f1cd8008d20 msgr2=0x7f1cd80291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:41.428 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.431 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.432 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.49
2026-03-25T15:29:41.505 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.509 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.509 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.50
2026-03-25T15:29:41.586 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.584+0000 7f31f40ba640 0 -- 192.168.123.104:0/600678812 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564116d72030 msgr2=0x564116d66aa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:41.587 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.590 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.51
2026-03-25T15:29:41.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:41.867+0000 7f5687ea4640 0 -- 192.168.123.104:0/3244814843 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558d2ceb6cc0 msgr2=0x558d2ce92310 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:41.873 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:41.877 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:41.877 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.52
2026-03-25T15:29:42.053 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.056 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.056 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.53
2026-03-25T15:29:42.137 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.135+0000 7fb78f8d3640 0 -- 192.168.123.104:0/313993112 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x55dc0692de90 msgr2=0x55dc0694e310 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.140 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.145 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.145 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.54
2026-03-25T15:29:42.236 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.235+0000 7feb18fac640 0 -- 192.168.123.104:0/2855006780 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5568f6df4c30 msgr2=0x5568f6dd3060 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.237 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.240 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.240 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.55
2026-03-25T15:29:42.304 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.302+0000 7f436d264640 0 -- 192.168.123.104:0/2476506608 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f434c05ef00 msgr2=0x7f434c07f2e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.309 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.312 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.312 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.56
2026-03-25T15:29:42.378 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.376+0000 7fe6b5d6e640 0 -- 192.168.123.104:0/3865115925 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x556eb8632cc0 msgr2=0x556eb860e310 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.378 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.382 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.382 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.57
2026-03-25T15:29:42.446 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.444+0000 7f82ad251640 0 -- 192.168.123.104:0/3409435422 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x565549b02cc0 msgr2=0x565549ade830 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.446 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.449 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.449 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.58
2026-03-25T15:29:42.509 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.512 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.512 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.59
2026-03-25T15:29:42.575 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.573+0000 7fa892674640 0 -- 192.168.123.104:0/1801324630 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fa870009c00 msgr2=0x7fa870042500 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.579 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.583 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.583 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.60
2026-03-25T15:29:42.663 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.661+0000 7f886930c640 0 -- 192.168.123.104:0/1298854889 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5638f3f2e030 msgr2=0x5638f3f1e370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:42.665 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.668 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.668 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.61
2026-03-25T15:29:42.753 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.751+0000 7f76fc3da640 0 -- 192.168.123.104:0/3619854793 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x563317de6ca0 msgr2=0x563317dc24c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.754 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.758 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.758 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.62
2026-03-25T15:29:42.826 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.829 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.63
2026-03-25T15:29:42.894 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.892+0000 7f5b8ae3c640 0 -- 192.168.123.104:0/2892507461 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5575f8cc8cc0 msgr2=0x5575f8ca44f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.894 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.897 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.898 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.64
2026-03-25T15:29:42.961 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:42.960+0000 7f1951890640 0 -- 192.168.123.104:0/1334237173 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55ff2a08a030 msgr2=0x55ff2a07a370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:42.963 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:42.966 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:42.966 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.65
2026-03-25T15:29:43.026 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.029 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.029 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.66
2026-03-25T15:29:43.089 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.087+0000 7f2062dfe640 0 -- 192.168.123.104:0/3891349994 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f203c0131d0 msgr2=0x7f203c013640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:43.092 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.095 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.095 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.67
2026-03-25T15:29:43.155 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.153+0000 7f2f7cb60640 0 -- 192.168.123.104:0/3496869523 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x5557fc3141b0 msgr2=0x5557fc334630 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:43.157 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.160 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.160 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.68
2026-03-25T15:29:43.221 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.220+0000 7fed91e9a640 0 -- 192.168.123.104:0/4162749078 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558c1abd7030 msgr2=0x558c1abc7370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:43.223 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.226 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.226 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.69
2026-03-25T15:29:43.294 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.297 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.297 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.70
2026-03-25T15:29:43.367 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.365+0000 7f3f3ae2a640 0 -- 192.168.123.104:0/2318328859 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5637805e9cc0 msgr2=0x5637805c54a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:43.368 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.372 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.71
2026-03-25T15:29:43.446 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.444+0000 7f14ddcef640 0 -- 192.168.123.104:0/843709591 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x560b51114cc0 msgr2=0x560b511661c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:43.450 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.454 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.454 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.72
2026-03-25T15:29:43.523 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.521+0000 7fe53cfd2640 0 -- 192.168.123.104:0/2195943944 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55645c86f9b0 msgr2=0x55645c973880 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:43.529 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.533 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.533 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.73
2026-03-25T15:29:43.625 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.624+0000 7fe4cffff640 0 -- 192.168.123.104:0/683542475 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fe4ac008d20 msgr2=0x7fe4ac0291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:43.632 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.635 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.636 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.74
2026-03-25T15:29:43.726 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.729 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.75
2026-03-25T15:29:43.797 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.795+0000 7f457db02640 0 -- 192.168.123.104:0/3085077151 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x556c97c64030 msgr2=0x556c97c54370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:43.798 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.802 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.802 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.76
2026-03-25T15:29:43.868 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.871 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.871 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.77
2026-03-25T15:29:43.933 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:43.931+0000 7fc4ba904640 0 -- 192.168.123.104:0/3194183725 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x561399069030 msgr2=0x561399059370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:43.937 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:43.940 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:43.940 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.78
2026-03-25T15:29:44.003 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.001+0000 7fec0a1de640 0 -- 192.168.123.104:0/1672275298 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564244320030 msgr2=0x564244310370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.007 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.010 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.010 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.79
2026-03-25T15:29:44.084 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.087 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.80
2026-03-25T15:29:44.155 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.153+0000 7fb5462e4640 0 -- 192.168.123.104:0/2510290593 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55bc9ec6dcc0 msgr2=0x55bc9ec49310 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.159 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.162 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.162 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.81
2026-03-25T15:29:44.242 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.240+0000 7f65a8c11640 0 -- 192.168.123.104:0/4244471108 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x5610a8bd5180 msgr2=0x5610a8bf5600 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.244 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.248 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.248 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.82
2026-03-25T15:29:44.317 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.315+0000 7fc4fbc5d640 0 -- 192.168.123.104:0/1978394853 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564b46433cc0 msgr2=0x564b4640f4f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.319 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.322 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.322 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.83
2026-03-25T15:29:44.390 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.393 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.393 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.84
2026-03-25T15:29:44.467 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.465+0000 7f8777dbf640 0 -- 192.168.123.104:0/1293921245 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5629d7a62030 msgr2=0x5629d7a52370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.467 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.471 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.85
2026-03-25T15:29:44.541 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.539+0000 7ff89e5be640 0 -- 192.168.123.104:0/246389744 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7ff87c006cb0 msgr2=0x7ff87c027140 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.544 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.547 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.548 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.86
2026-03-25T15:29:44.622 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.625 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.625 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.87
2026-03-25T15:29:44.700 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.698+0000 7f38e6456640 0 -- 192.168.123.104:0/2120755315 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f38c0003850 msgr2=0x7f38c00090f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.704 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.708 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.709 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.88
2026-03-25T15:29:44.770 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.768+0000 7fd3b5b52640 0 -- 192.168.123.104:0/1877716648 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55be8637fcc0 msgr2=0x55be863d11c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.774 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.778 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.778 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.89
2026-03-25T15:29:44.861 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.859+0000 7f9f72f49640 0 -- 192.168.123.104:0/688460737 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x555ae34f0030 msgr2=0x555ae34e0370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.865 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.868 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.868 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.90
2026-03-25T15:29:44.944 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:44.942+0000 7fe2e9538640 0 -- 192.168.123.104:0/564454651 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5600c37cb030 msgr2=0x5600c37bb370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:44.948 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:44.951 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:44.951 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.91
2026-03-25T15:29:45.047 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.046+0000 7fbfc804c640 0 -- 192.168.123.104:0/2320880798 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fbfa805ef70 msgr2=0x7fbfa807f350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:45.050 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:45.053 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:45.053 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.92
2026-03-25T15:29:45.118 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.116+0000 7feecd145640 0 -- 192.168.123.104:0/2418310701 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55d6333d7cc0 msgr2=0x55d633429390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:45.124 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:45.127 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:45.127 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.93
2026-03-25T15:29:45.198 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.196+0000 7f1bfb807640 0 -- 192.168.123.104:0/1121631012 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5569cbeefcc0 msgr2=0x5569cbf41390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:45.198 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:45.202 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:45.202 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.94
2026-03-25T15:29:45.268 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.266+0000 7f30078b2640 0 -- 192.168.123.104:0/3165736909 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f2fe40011e0 msgr2=0x7f2fe40271b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:45.272 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:45.275 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99)
2026-03-25T15:29:45.275 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.95
2026-03-25T15:29:45.353 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:29:45.356 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:45.356 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.96 2026-03-25T15:29:45.422 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.421+0000 7f3dbba57640 0 -- 192.168.123.104:0/2830986649 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55a1c5b93cc0 msgr2=0x55a1c5be56d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:45.428 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:45.431 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:45.431 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.97 2026-03-25T15:29:45.503 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.501+0000 7f84c5d6c640 0 -- 192.168.123.104:0/1889450082 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f84a0008d20 msgr2=0x7f84a00291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-25T15:29:45.508 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:45.512 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:45.512 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.98 2026-03-25T15:29:45.586 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:45.584+0000 7fedd033a640 0 -- 192.168.123.104:0/1710694416 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55f4751867e0 msgr2=0x55f47528a880 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:45.590 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:45.594 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in $(seq -w 00 99) 2026-03-25T15:29:45.594 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm image.99 2026-03-25T15:29:45.675 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:45.679 INFO:tasks.workunit.client.0.vm04.stderr:+ test_remove 2026-03-25T15:29:45.679 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing remove...' 2026-03-25T15:29:45.679 INFO:tasks.workunit.client.0.vm04.stdout:testing remove... 2026-03-25T15:29:45.679 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:29:45.679 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:45.758 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:45.831 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:45.902 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:45.977 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.054 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.122 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.196 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.265 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.334 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.406 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.485 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.558 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.640 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.809 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.891 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:46.972 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:47.111 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:47.179 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove NOT_EXIST 2026-03-25T15:29:47.220 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed. 2026-03-25T15:29:47.220 INFO:tasks.workunit.client.0.vm04.stderr:rbd: delete error: (2) No such file or directory 2026-03-25T15:29:47.223 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:29:47.223 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test1 2026-03-25T15:29:47.245 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:29:47.254 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.252+0000 7f5d847a4300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:47.261 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:29:47.296 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:47.300 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:47.300 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:47.300 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:47.333 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:47.333 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-25T15:29:47.374 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:47.444 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.442+0000 7fe23ddca640 0 -- 192.168.123.104:0/2390679988 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55b6471b8cc0 msgr2=0x55b6471947a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:47.449 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:47.452 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:47.453 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:47.453 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:47.486 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:47.486 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 1 test1 2026-03-25T15:29:47.508 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:29:47.515 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.513+0000 7f4616ca0300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:47.523 INFO:tasks.workunit.client.0.vm04.stderr:+ rados rm -p rbd test1.rbd 2026-03-25T15:29:47.550 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:29:47.579 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:47.582 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:47.582 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:47.583 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:47.608 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:47.608 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -eq 0 ']' 2026-03-25T15:29:47.608 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-25T15:29:47.662 INFO:tasks.workunit.client.0.vm04.stderr:++ rados -p rbd ls 2026-03-25T15:29:47.662 INFO:tasks.workunit.client.0.vm04.stderr:++ grep '^rbd_header' 2026-03-25T15:29:47.693 INFO:tasks.workunit.client.0.vm04.stderr:+ HEADER=rbd_header.188517dd8608 2026-03-25T15:29:47.693 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_header.188517dd8608 2026-03-25T15:29:47.721 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:47.755 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.753+0000 7f376bfff640 -1 librbd::image::OpenRequest: failed 
to retrieve initial metadata: (2) No such file or directory 2026-03-25T15:29:47.757 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.755+0000 7f376bfff640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory 2026-03-25T15:29:47.768 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.766+0000 7f3778efb640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory 2026-03-25T15:29:47.777 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:47.781 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:47.782 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:47.782 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:47.812 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:47.812 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-25T15:29:47.856 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_id.test2 2026-03-25T15:29:47.884 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:47.959 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:47.957+0000 7f9381a66640 0 -- 192.168.123.104:0/1629492248 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558bf6235030 msgr2=0x558bf6225370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:47.961 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:47.965 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:47.965 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:47.965 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:47.992 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:47.993 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-25T15:29:48.032 INFO:tasks.workunit.client.0.vm04.stderr:++ rados -p rbd ls 2026-03-25T15:29:48.032 INFO:tasks.workunit.client.0.vm04.stderr:++ grep '^rbd_header' 2026-03-25T15:29:48.259 INFO:tasks.workunit.client.0.vm04.stderr:+ HEADER=rbd_header.18a383006b87 2026-03-25T15:29:48.259 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_header.18a383006b87 2026-03-25T15:29:48.286 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_id.test2 2026-03-25T15:29:48.311 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:48.337 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:48.336+0000 7f291f7fe640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory 2026-03-25T15:29:48.338 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:48.337+0000 7f291f7fe640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory 2026-03-25T15:29:48.346 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:48.345+0000 7f291f7fe640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory 2026-03-25T15:29:48.353 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:48.357 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:48.357 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:48.357 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:48.385 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:48.386 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-25T15:29:48.425 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test2@snap 2026-03-25T15:29:48.968 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-25T15:29:48.978 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test2@snap 2026-03-25T15:29:49.022 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test2@snap clone --rbd-default-clone-format 1 2026-03-25T15:29:49.075 INFO:tasks.workunit.client.0.vm04.stderr:+ rados -p rbd rm rbd_children 2026-03-25T15:29:49.101 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone 2026-03-25T15:29:49.177 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:49.175+0000 7f839193d640 0 -- 192.168.123.104:0/3878323154 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55c3b7fdf030 msgr2=0x55c3b7fcf370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:49.177 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:29:49.181 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:49.181 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone 2026-03-25T15:29:49.181 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:49.181 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:49.207 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:49.207 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test2@snap 2026-03-25T15:29:49.246 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test2@snap 2026-03-25T15:29:49.969 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-25T15:29:49.978 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:29:50.180 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:50.179+0000 7fd50ba07640 0 -- 192.168.123.104:0/1110555625 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fd4e805eff0 msgr2=0x7fd4e807f3f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:50.201 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:29:50.209 INFO:tasks.workunit.client.0.vm04.stderr:+ test_migration 2026-03-25T15:29:50.209 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing migration...' 2026-03-25T15:29:50.209 INFO:tasks.workunit.client.0.vm04.stdout:testing migration... 
2026-03-25T15:29:50.209 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:29:50.209 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:50.319 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:50.387 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:50.533 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:50.662 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:50.813 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:50.935 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.074 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.205 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.388 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.516 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.643 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.831 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:51.934 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:52.073 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:52.238 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:52.375 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:52.509 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:29:52.662 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:29:53.180 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:29:53.192 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:29:56.147 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 1 -s 128M test1 2026-03-25T15:29:56.172 
INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated 2026-03-25T15:29:56.179 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:56.178+0000 7fc2719b1300 -1 librbd: Forced V1 image creation. 2026-03-25T15:29:56.188 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1 2026-03-25T15:29:56.188 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'format: 1' 2026-03-25T15:29:56.227 INFO:tasks.workunit.client.0.vm04.stdout: format: 1 2026-03-25T15:29:56.227 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 --image-format 2 2026-03-25T15:29:56.297 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state test1 2026-03-25T15:29:56.297 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=test1 2026-03-25T15:29:56.297 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status test1 2026-03-25T15:29:56.297 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state 2026-03-25T15:29:56.365 INFO:tasks.workunit.client.0.vm04.stderr:+ test prepared = prepared 2026-03-25T15:29:56.366 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1 2026-03-25T15:29:56.366 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'format: 2' 2026-03-25T15:29:56.410 INFO:tasks.workunit.client.0.vm04.stdout: format: 2 2026-03-25T15:29:56.410 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:29:56.447 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:56.446+0000 7fa259c49300 -1 librbd::image::PreRemoveRequest: 0x55a37cf25610 validate_image_removal: image in migration state - not removing 2026-03-25T15:29:56.449 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed. 2026-03-25T15:29:56.449 INFO:tasks.workunit.client.0.vm04.stderr:rbd: error: image still has watchers 2026-03-25T15:29:56.450 INFO:tasks.workunit.client.0.vm04.stderr:This means the image is still open or the client using it crashed. 
Try again after closing/unmapping it or waiting 30s for the crashed client to timeout. 2026-03-25T15:29:56.454 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:29:56.454 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1 2026-03-25T15:29:56.529 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete...2026-03-25T15:29:56.527+0000 7fdaed221640 0 -- 192.168.123.104:0/4170185959 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x556822c3f110 msgr2=0x556822ccdcb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:56.537 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 90% complete... Image migration: 93% complete... 
Image migration: 96% complete...2026-03-25T15:29:56.535+0000 7fdaebf98640 0 -- 192.168.123.104:0/4170185959 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fdac4008d20 msgr2=0x7fdac40291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:56.545 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done. 2026-03-25T15:29:56.549 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state test1 2026-03-25T15:29:56.550 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=test1 2026-03-25T15:29:56.550 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status test1 2026-03-25T15:29:56.550 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state 2026-03-25T15:29:56.607 INFO:tasks.workunit.client.0.vm04.stderr:+ test executed = executed 2026-03-25T15:29:56.607 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 2026-03-25T15:29:56.684 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete...2026-03-25T15:29:56.682+0000 7f425ffff640 0 -- 192.168.123.104:0/1186630645 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f4240012b60 msgr2=0x7f4240007810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:56.891 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... 
Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-25T15:29:56.889+0000 7f425ffff640 0 -- 192.168.123.104:0/1186630645 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f4240012b60 msgr2=0x7f4240007810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-25T15:29:56.901 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done. 
2026-03-25T15:29:56.905 INFO:tasks.workunit.client.0.vm04.stderr:+ get_migration_state test1 2026-03-25T15:29:56.905 INFO:tasks.workunit.client.0.vm04.stderr:+ local image=test1 2026-03-25T15:29:56.905 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --format xml status test1 2026-03-25T15:29:56.905 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //status/migration/state 2026-03-25T15:29:56.939 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:29:56.939 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1 2026-03-25T15:29:56.939 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features: .*layering' 2026-03-25T15:29:56.975 INFO:tasks.workunit.client.0.vm04.stderr:+ true 2026-03-25T15:29:56.975 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 --image-feature layering,exclusive-lock,object-map,fast-diff,deep-flatten 2026-03-25T15:29:57.272 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1 2026-03-25T15:29:57.272 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features: .*layering' 2026-03-25T15:29:57.321 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, migrating 2026-03-25T15:29:57.321 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1 2026-03-25T15:29:57.413 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... 
Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-25T15:29:57.411+0000 7f0f19d62640 0 -- 192.168.123.104:0/127463849 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f0efc008d60 msgr2=0x7f0efc0291e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-25T15:29:57.434 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:57.432+0000 7f0f19561640 0 -- 192.168.123.104:0/127463849 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f0ef0004a30 msgr2=0x7f0ef0025430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:57.435 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done. 2026-03-25T15:29:57.440 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 2026-03-25T15:29:57.525 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:57.523+0000 7f0ac37af640 0 -- 192.168.123.104:0/2576660319 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f0a9c008d20 msgr2=0x7f0a9c0291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:57.534 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... 
Commit image migration: 31% complete...2026-03-25T15:29:57.533+0000 7f0ac4a38640 0 -- 192.168.123.104:0/2576660319 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x56495ae47210 msgr2=0x56495aed70c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:57.554 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done. 
2026-03-25T15:29:57.558 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 rbd2/test1 2026-03-25T15:29:57.635 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:57.633+0000 7f5d7a7af640 0 -- 192.168.123.104:0/1857424630 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f5d5c008d60 msgr2=0x7f5d5c0291e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:57.643 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/test1 2026-03-25T15:29:57.643 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/test1 2026-03-25T15:29:57.643 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/test1 2026-03-25T15:29:57.643 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state 2026-03-25T15:29:57.705 INFO:tasks.workunit.client.0.vm04.stderr:+ test prepared = prepared 2026-03-25T15:29:57.705 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:29:57.705 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:29:57.705 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:29:57.732 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:29:57.732 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls 2026-03-25T15:29:57.733 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-25T15:29:57.761 INFO:tasks.workunit.client.0.vm04.stdout:test1 2026-03-25T15:29:57.762 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1 2026-03-25T15:29:57.821 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... 
Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-25T15:29:57.820+0000 7f0967fff640 0 -- 192.168.123.104:0/3111169115 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f0944008150 msgr2=0x7f0944028530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:29:57.826 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done. 2026-03-25T15:29:57.830 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/test1 2026-03-25T15:29:57.830 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/test1 2026-03-25T15:29:57.831 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/test1 2026-03-25T15:29:57.831 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state 2026-03-25T15:29:57.893 INFO:tasks.workunit.client.0.vm04.stderr:+ test executed = executed 2026-03-25T15:29:57.893 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd2/test1 2026-03-25T15:29:57.930 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:57.928+0000 7fdfddec0640 -1 librbd::image::PreRemoveRequest: 0x5560cb89e140 validate_image_removal: image in migration state - not removing 2026-03-25T15:29:57.933 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed. 
2026-03-25T15:29:57.933 INFO:tasks.workunit.client.0.vm04.stderr:rbd: error: image still has watchers
2026-03-25T15:29:57.933 INFO:tasks.workunit.client.0.vm04.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-25T15:29:57.937 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-25T15:29:57.937 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-25T15:29:57.999 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-25T15:29:57.998+0000 7f2092838640 0 -- 192.168.123.104:0/3084705274 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f2070007420 msgr2=0x7f20700278a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:58.009 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done.
2026-03-25T15:29:58.013 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns1
2026-03-25T15:29:58.047 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns2
2026-03-25T15:29:58.082 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare rbd2/test1 rbd2/ns1/test1
2026-03-25T15:29:58.156 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.154+0000 7f90f05eb640 0 -- 192.168.123.104:0/2815793582 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55a15d21b1f0 msgr2=0x55a15d1ebfb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.164 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-25T15:29:58.164 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/ns1/test1
2026-03-25T15:29:58.164 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-25T15:29:58.164 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-25T15:29:58.229 INFO:tasks.workunit.client.0.vm04.stderr:+ test prepared = prepared
2026-03-25T15:29:58.229 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute rbd2/test1
2026-03-25T15:29:58.290 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-25T15:29:58.288+0000 7f0eda34f640 0 -- 192.168.123.104:0/162264667 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f0eb0008090 msgr2=0x7f0eb0028510 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.298 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.297+0000 7f0eda34f640 0 -- 192.168.123.104:0/162264667 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f0eb805f650 msgr2=0x7f0eb807fa50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.299 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
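The `get_migration_state` helper traced above shells out to `rbd --format xml status <image>` and extracts the value at `//status/migration/state` with xmlstarlet. A minimal Python equivalent of that extraction, where the sample XML document is illustrative (shaped only by what the XPath implies, not captured from this run):

```python
import xml.etree.ElementTree as ET

# Illustrative sample; the real document comes from `rbd --format xml status <image>`.
SAMPLE_STATUS = """\
<status>
  <watchers></watchers>
  <migration>
    <state>executed</state>
  </migration>
</status>"""

def get_migration_state(status_xml: str) -> str:
    """Mirror `xmlstarlet sel -t -v //status/migration/state`."""
    root = ET.fromstring(status_xml)
    # The root element here is <status>, so search relative to it.
    node = root.find("./migration/state")
    if node is None or node.text is None:
        raise ValueError("no migration state in status document")
    return node.text.strip()

print(get_migration_state(SAMPLE_STATUS))  # prints "executed"
```

The test script then compares this value against the expected state (`test prepared = prepared`, `test executed = executed`) after each migration step.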
2026-03-25T15:29:58.303 INFO:tasks.workunit.client.0.vm04.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-25T15:29:58.303 INFO:tasks.workunit.client.0.vm04.stderr:++ local image=rbd2/ns1/test1
2026-03-25T15:29:58.303 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-25T15:29:58.303 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-25T15:29:58.396 INFO:tasks.workunit.client.0.vm04.stderr:+ test executed = executed
2026-03-25T15:29:58.396 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit rbd2/test1
2026-03-25T15:29:58.469 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-25T15:29:58.467+0000 7f6577fff640 0 -- 192.168.123.104:0/132770552 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f65580080a0 msgr2=0x7f6558028520 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:58.477 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done.
2026-03-25T15:29:58.481 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare rbd2/ns1/test1 rbd2/ns2/test1
2026-03-25T15:29:58.562 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.560+0000 7f4eaa666640 0 -- 192.168.123.104:0/91005242 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x5653f0caa8e0 msgr2=0x5653f0ccacc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.571 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute rbd2/ns2/test1
2026-03-25T15:29:58.630 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-25T15:29:58.628+0000 7f4f39f69640 0 -- 192.168.123.104:0/1326052879 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x55599c351340 msgr2=0x55599c371720 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.638 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.637+0000 7f4f39f69640 0 -- 192.168.123.104:0/1326052879 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f4f1805f850 msgr2=0x7f4f1807fc50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.640 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-25T15:29:58.644 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit rbd2/ns2/test1
2026-03-25T15:29:58.718 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.716+0000 7f7bd6b3d640 0 -- 192.168.123.104:0/3292219549 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f7bc4002340 msgr2=0x55da041e9870 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:29:58.726 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.724+0000 7f7bd7dc6640 0 -- 192.168.123.104:0/3292219549 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55da04137fb0 msgr2=0x55da0418d2e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.739 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-25T15:29:58.744 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M test1
2026-03-25T15:29:58.771 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.769+0000 7f7da4c66300 -1 librbd: Forced V1 image creation.
2026-03-25T15:29:58.779 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 --data-pool rbd2
2026-03-25T15:29:58.843 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-25T15:29:58.843 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'data_pool: rbd2'
2026-03-25T15:29:58.883 INFO:tasks.workunit.client.0.vm04.stdout: data_pool: rbd2
2026-03-25T15:29:58.883 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-25T15:29:58.952 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-25T15:29:58.950+0000 7f2efea1a640 0 -- 192.168.123.104:0/2648100122 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f2edc004a30 msgr2=0x7f2edc025430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.956 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:58.955+0000 7f2eff21b640 0 -- 192.168.123.104:0/2648100122 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f2ed8008d60 msgr2=0x7f2ed80291e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:58.963 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-25T15:29:58.967 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-25T15:29:59.036 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.035+0000 7fa52a770640 0 -- 192.168.123.104:0/1511835129 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55f029699310 msgr2=0x55f029728420 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:59.048 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete...2026-03-25T15:29:59.046+0000 7fa528ce6640 0 -- 192.168.123.104:0/1511835129 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fa504004a30 msgr2=0x7fa504025430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:59.057 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-25T15:29:59.061 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1
2026-03-25T15:29:59.131 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash mv test1
2026-03-25T15:29:59.131 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test1
2026-03-25T15:29:59.165 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.163+0000 7f578da26300 -1 librbd::api::Trash: move: cannot move migrating image to trash
2026-03-25T15:29:59.167 INFO:tasks.workunit.client.0.vm04.stderr:rbd: deferred delete error: (16) Device or resource busy
2026-03-25T15:29:59.170 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:29:59.170 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls -a
2026-03-25T15:29:59.171 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-25T15:29:59.204 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=19a561890a69
2026-03-25T15:29:59.204 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash rm 19a561890a69
2026-03-25T15:29:59.204 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 19a561890a69
2026-03-25T15:29:59.426 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.425+0000 7f5cf8857640 0 --2- 192.168.123.104:0/3117346698 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ce4002340 0x7f5ce4002730 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-25T15:29:59.437 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.436+0000 7f5cf2ffd640 -1 librbd::image::RefreshRequest: image being migrated
2026-03-25T15:29:59.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.436+0000 7f5cf2ffd640 -1 librbd::image::OpenRequest: failed to refresh image: (30) Read-only file system
2026-03-25T15:29:59.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.436+0000 7f5cf27fc640 -1 librbd::ImageState: 0x7f5cd403c0e0 failed to open image: (30) Read-only file system
2026-03-25T15:29:59.438 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:29:59.436+0000 7f5cdd7fa640 -1 librbd::image::RemoveRequest: 0x7f5cd4000b80 handle_open_image: error opening image: (30) Read-only file system
2026-03-25T15:29:59.439 INFO:tasks.workunit.client.0.vm04.stderr:rbd: remove error: (30) Read-only file system
2026-03-25T15:29:59.439 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-25T15:29:59.443 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:29:59.443 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash restore 19a561890a69
2026-03-25T15:29:59.443 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash restore 19a561890a69
2026-03-25T15:29:59.470 INFO:tasks.workunit.client.0.vm04.stderr:rbd: restore error: 2026-03-25T15:29:59.469+0000 7fd79d7b7300 -1 librbd::api::Trash: restore: Current trash source 'migration' does not match expected: user,mirroring,unknown (4)
2026-03-25T15:29:59.470 INFO:tasks.workunit.client.0.vm04.stderr:(22) Invalid argument
2026-03-25T15:29:59.473 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:29:59.474 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test1
2026-03-25T15:29:59.536 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... 2026-03-25T15:29:59.534+0000 7fa5ec403640 0 -- 192.168.123.104:0/3993945675 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564442ffd0c0 msgr2=0x564443089ee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:59.553 INFO:tasks.workunit.client.0.vm04.stderr:Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-25T15:29:59.551+0000 7fa5ec403640 0 -- 192.168.123.104:0/3993945675 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fa5cc080830 msgr2=0x7fa5cc0a0c50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:29:59.564 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-25T15:29:59.567 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove test1
2026-03-25T15:29:59.648 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
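Taken together, the commands traced so far exercise the live-migration lifecycle: `prepare` leaves the image in state `prepared`, `execute` moves it to `executed`, and the migration then ends in either `commit` or `abort`; while a migration is in flight, `rbd rm` and `rbd trash mv` are rejected. A toy model of those transitions follows. The state names come from this log; the transition table itself is my own sketch for illustration, not librbd code:

```python
# Hypothetical sketch of the migration lifecycle observed in the log.
TRANSITIONS = {
    ("idle", "prepare"): "prepared",
    ("prepared", "execute"): "executed",
    ("prepared", "abort"): "idle",      # abort rolls back to the source image
    ("executed", "abort"): "idle",
    ("executed", "commit"): "committed",
}

def step(state: str, action: str) -> str:
    """Apply one rbd-migration action, rejecting invalid transitions."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise RuntimeError(f"{action!r} not allowed in state {state!r}")

state = "idle"
for action in ("prepare", "execute", "commit"):
    state = step(state, action)
print(state)  # prints "committed"
```

In the run above, the first `test1` migration takes the prepare/abort path (after the trash operations are shown to fail), while the earlier namespace migrations take prepare/execute/commit to completion.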
2026-03-25T15:29:59.652 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/dev/urandom bs=1M count=1
2026-03-25T15:29:59.652 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd --image-format 2 import - test1
2026-03-25T15:29:59.696 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-25T15:29:59.696 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-25T15:29:59.697 INFO:tasks.workunit.client.0.vm04.stderr:1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0439382 s, 23.9 MB/s
2026-03-25T15:29:59.722 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 100% complete...done.
2026-03-25T15:29:59.726 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export test1 -
2026-03-25T15:29:59.726 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-25T15:29:59.765 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:29:59.768 INFO:tasks.workunit.client.0.vm04.stderr:+ md5sum='0a3f93b5049f28aed4df865d9c3745ba -'
2026-03-25T15:29:59.768 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap1
2026-03-25T15:29:59.866 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:29:59.874 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@snap1
2026-03-25T15:29:59.917 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap2
2026-03-25T15:30:00.868 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:30:00.876 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@snap1 clone_v1 --rbd_default_clone_format=1
2026-03-25T15:30:00.933 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@snap2 clone_v2 --rbd_default_clone_format=2
2026-03-25T15:30:00.987 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v1
2026-03-25T15:30:00.987 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap1'
2026-03-25T15:30:01.026 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap1
2026-03-25T15:30:01.027 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-25T15:30:01.027 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap2'
2026-03-25T15:30:01.064 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap2
2026-03-25T15:30:01.064 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-25T15:30:01.064 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'op_features: clone-child'
2026-03-25T15:30:01.102 INFO:tasks.workunit.client.0.vm04.stdout: op_features: clone-child
2026-03-25T15:30:01.102 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v1 -
2026-03-25T15:30:01.102 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-25T15:30:01.139 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:01.144 INFO:tasks.workunit.client.0.vm04.stderr:+ test '0a3f93b5049f28aed4df865d9c3745ba -' = '0a3f93b5049f28aed4df865d9c3745ba -'
2026-03-25T15:30:01.144 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v2 -
2026-03-25T15:30:01.144 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-25T15:30:01.180 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
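The data-integrity check running here pipes `rbd export <image> -` through `md5sum` and compares the digest against the one captured when the 1 MiB of `/dev/urandom` was imported: if migration and cloning are lossless, the checksums must match byte for byte. The same digest-compare can be sketched with Python's hashlib (the payload below is a stand-in for the test's random data, not the actual bytes):

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """Digest a byte stream the way `md5sum` reports it (lowercase hex)."""
    return hashlib.md5(data).hexdigest()

payload = b"x" * (1024 * 1024)   # stand-in for the 1 MiB random import
before = md5_hex(payload)        # checksum recorded at "import" time
after = md5_hex(payload)         # checksum of a lossless "export"
assert before == after           # the round-trip must be byte-identical
print(len(before))  # prints 32 (hex characters in an MD5 digest)
```

The trailing `' -'` in the log's stored `md5sum=` value is just `md5sum`'s filename field for stdin, which the test keeps in the compared string.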
2026-03-25T15:30:01.185 INFO:tasks.workunit.client.0.vm04.stderr:+ test '0a3f93b5049f28aed4df865d9c3745ba -' = '0a3f93b5049f28aed4df865d9c3745ba -'
2026-03-25T15:30:01.185 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap1
2026-03-25T15:30:01.223 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-25T15:30:01.224 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap2
2026-03-25T15:30:01.260 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-25T15:30:01.260 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test1 rbd2/test2
2026-03-25T15:30:01.339 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:01.337+0000 7f4efcd29640 0 -- 192.168.123.104:0/3865142345 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f4ed40065b0 msgr2=0x7f4ed4026f20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:01.693 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:01.692+0000 7f4efcd29640 0 -- 192.168.123.104:0/3865142345 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f4edc07ffb0 msgr2=0x7f4edc0a03b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:02.756 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v1
2026-03-25T15:30:02.756 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd2/test2@snap1'
2026-03-25T15:30:02.803 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd2/test2@snap1
2026-03-25T15:30:02.803 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-25T15:30:02.803 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd2/test2@snap2'
2026-03-25T15:30:02.853 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd2/test2@snap2
2026-03-25T15:30:02.853 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-25T15:30:02.853 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'op_features: clone-child'
2026-03-25T15:30:02.902 INFO:tasks.workunit.client.0.vm04.stdout: op_features: clone-child
2026-03-25T15:30:02.903 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children rbd2/test2@snap1
2026-03-25T15:30:02.951 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-25T15:30:02.951 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children rbd2/test2@snap2
2026-03-25T15:30:02.996 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-25T15:30:02.996 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1
2026-03-25T15:30:03.118 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-25T15:30:03.122 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd migration commit test1
2026-03-25T15:30:03.122 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1
2026-03-25T15:30:03.182 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants:
2026-03-25T15:30:03.182 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1
2026-03-25T15:30:03.182 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2
2026-03-25T15:30:03.182 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-25T15:30:03.182 INFO:tasks.workunit.client.0.vm04.stderr:Ensure no descendant images are opened read-only and run again with force flag.
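The `expect_fail` wrapper used throughout this trace runs a command that is supposed to fail (here, committing while descendant clones exist) and itself succeeds only when the wrapped command exits non-zero. The real helper lives in the workunit's shell library; this is a minimal sketch of that conventional shape, not copied from qa/workunits:

```shell
#!/bin/sh
# Succeed only when the wrapped command fails; flag an unexpected success.
expect_fail() {
    if "$@"; then
        echo "expected failure, got success: $*" >&2
        return 1
    fi
    return 0
}

expect_fail false && echo "ok: false failed as expected"
expect_fail true 2>/dev/null || echo "ok: unexpected success was flagged"
```

Because the helper inverts the exit status, the `+ return 0` lines in the log after each rejected `rbd` command show `expect_fail` treating the failure as the desired outcome.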
2026-03-25T15:30:03.185 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:03.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 --force
2026-03-25T15:30:03.247 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants:
2026-03-25T15:30:03.247 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1
2026-03-25T15:30:03.247 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2
2026-03-25T15:30:03.247 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-25T15:30:03.247 INFO:tasks.workunit.client.0.vm04.stderr:Proceeding anyway due to force flag set.
2026-03-25T15:30:03.253 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:03.252+0000 7f542c444640 0 -- 192.168.123.104:0/3611272327 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x561347a097f0 msgr2=0x561347a95b80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:03.261 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:03.259+0000 7f542b1bb640 0 -- 192.168.123.104:0/3611272327 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f5408005c10 msgr2=0x7f5408025ff0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:04.895 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done.
2026-03-25T15:30:04.899 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v1 -
2026-03-25T15:30:04.899 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-25T15:30:04.943 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:04.949 INFO:tasks.workunit.client.0.vm04.stderr:+ test '0a3f93b5049f28aed4df865d9c3745ba -' = '0a3f93b5049f28aed4df865d9c3745ba -'
2026-03-25T15:30:04.951 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v2 -
2026-03-25T15:30:04.951 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum
2026-03-25T15:30:04.990 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:04.995 INFO:tasks.workunit.client.0.vm04.stderr:+ test '0a3f93b5049f28aed4df865d9c3745ba -' = '0a3f93b5049f28aed4df865d9c3745ba -'
2026-03-25T15:30:04.995 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare rbd2/test2 test1
2026-03-25T15:30:05.083 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:05.081+0000 7fb8c2d05640 0 -- 192.168.123.104:0/1428238564 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x56370ea914f0 msgr2=0x56370eab18d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:07.010 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:07.008+0000 7fb8c2d05640 0 -- 192.168.123.104:0/1428238564 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb8a405f440 msgr2=0x7fb8a407f840 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:07.122 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v1
2026-03-25T15:30:07.122 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap1'
2026-03-25T15:30:07.185 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap1
2026-03-25T15:30:07.186 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-25T15:30:07.186 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'parent: rbd/test1@snap2'
2026-03-25T15:30:07.240 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/test1@snap2
2026-03-25T15:30:07.240 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info clone_v2
2026-03-25T15:30:07.240 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 'op_features: clone-child' 2026-03-25T15:30:07.295 INFO:tasks.workunit.client.0.vm04.stdout: op_features: clone-child 2026-03-25T15:30:07.295 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap1 2026-03-25T15:30:07.355 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v1 = rbd/clone_v1 2026-03-25T15:30:07.355 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd children test1@snap2 2026-03-25T15:30:07.421 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd/clone_v2 = rbd/clone_v2 2026-03-25T15:30:07.421 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test1 2026-03-25T15:30:07.584 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:07.582+0000 7ff7a07e4640 0 -- 192.168.123.104:0/3098168238 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55ce0dc9c160 msgr2=0x55ce0dd07e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:30:07.588 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done. 
2026-03-25T15:30:07.592 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd migration commit test1 2026-03-25T15:30:07.593 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 2026-03-25T15:30:07.701 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:07.699+0000 7f210c616640 0 -- 192.168.123.104:0/1540043033 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f20ec05f560 msgr2=0x7f20ec07f960 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:30:07.701 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants: 2026-03-25T15:30:07.701 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1 2026-03-25T15:30:07.701 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2 2026-03-25T15:30:07.701 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update. 2026-03-25T15:30:07.701 INFO:tasks.workunit.client.0.vm04.stderr:Ensure no descendant images are opened read-only and run again with force flag. 2026-03-25T15:30:07.706 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:30:07.707 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration commit test1 --force 2026-03-25T15:30:07.803 INFO:tasks.workunit.client.0.vm04.stderr:rbd: the image has descendants: 2026-03-25T15:30:07.803 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v1 2026-03-25T15:30:07.803 INFO:tasks.workunit.client.0.vm04.stderr: rbd/clone_v2 2026-03-25T15:30:07.803 INFO:tasks.workunit.client.0.vm04.stderr:Warning: in-use, read-only descendant images will not detect the parent update. 2026-03-25T15:30:07.803 INFO:tasks.workunit.client.0.vm04.stderr:Proceeding anyway due to force flag set. 
2026-03-25T15:30:07.804 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:07.803+0000 7f2a95fd0640 0 -- 192.168.123.104:0/1849305666 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f2a7405f490 msgr2=0x7f2a7407f890 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:30:07.816 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:07.814+0000 7f2a95fd0640 0 -- 192.168.123.104:0/1849305666 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x562746f3e7f0 msgr2=0x7f2a740a0640 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:30:09.108 INFO:tasks.workunit.client.0.vm04.stderr: Commit image migration: 100% complete...done. 2026-03-25T15:30:09.117 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v1 - 2026-03-25T15:30:09.117 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum 2026-03-25T15:30:09.158 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done. 2026-03-25T15:30:09.163 INFO:tasks.workunit.client.0.vm04.stderr:+ test '0a3f93b5049f28aed4df865d9c3745ba -' = '0a3f93b5049f28aed4df865d9c3745ba -' 2026-03-25T15:30:09.163 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export clone_v2 - 2026-03-25T15:30:09.163 INFO:tasks.workunit.client.0.vm04.stderr:++ md5sum 2026-03-25T15:30:09.209 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done. 
2026-03-25T15:30:09.214 INFO:tasks.workunit.client.0.vm04.stderr:+ test '0a3f93b5049f28aed4df865d9c3745ba -' = '0a3f93b5049f28aed4df865d9c3745ba -' 2026-03-25T15:30:09.214 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove clone_v1 2026-03-25T15:30:09.362 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:09.360+0000 7fbde99d8640 0 -- 192.168.123.104:0/3387804265 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564612a24030 msgr2=0x564612a14370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-25T15:30:09.367 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:30:09.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove clone_v2 2026-03-25T15:30:09.518 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:09.516+0000 7fc647280640 0 -- 192.168.123.104:0/2641812528 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55809f9c7cc0 msgr2=0x55809f9a3310 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:30:09.526 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:30:09.530 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test1@snap1 2026-03-25T15:30:09.602 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge test1 2026-03-25T15:30:11.055 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
2026-03-25T15:30:11.062 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:30:11.139 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:11.138+0000 7f9ae5051640 0 -- 192.168.123.104:0/1940801956 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x560477c279b0 msgr2=0x560477d2b3d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:11.140 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:11.143 INFO:tasks.workunit.client.0.vm04.stderr:+ for format in 1 2
2026-03-25T15:30:11.143 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-25T15:30:11.164 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-25T15:30:11.170 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:11.169+0000 7f82f7edf300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:11.177 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-25T15:30:11.232 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:11.270 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:11.273 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:11.273 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:11.279 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:11.344 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete...
Abort image migration: 98% complete...2026-03-25T15:30:11.343+0000 7f14d5e96640 0 -- 192.168.123.104:0/1810650448 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x5591b468e0c0 msgr2=0x5591b471aee0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:11.358 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-25T15:30:11.361 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:11.395 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:11.398 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:11.398 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 499.998 bytes/sec: 500 KiB/s
2026-03-25T15:30:11.402 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:11.459 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 100% complete...done.
2026-03-25T15:30:11.462 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-25T15:30:11.481 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-25T15:30:11.487 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:11.486+0000 7f45426fa300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:11.709 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-25T15:30:11.770 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:11.875 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:11.878 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:11.878 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 499.998 bytes/sec: 500 KiB/s
2026-03-25T15:30:11.883 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test2
2026-03-25T15:30:11.940 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% complete... Image migration: 90% complete... 2026-03-25T15:30:11.938+0000 7f2284530640 0 -- 192.168.123.104:0/3204948434 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x5637391d40c0 msgr2=0x563739260da0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:11.944 INFO:tasks.workunit.client.0.vm04.stderr:Image migration: 93% complete... Image migration: 96% complete...2026-03-25T15:30:11.942+0000 7f22832a7640 0 -- 192.168.123.104:0/3204948434 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f225c008d20 msgr2=0x7f225c0291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:11.948 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-25T15:30:11.952 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:12.001 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete...
Abort image migration: 98% complete...2026-03-25T15:30:12.000+0000 7f909d0e4640 0 -- 192.168.123.104:0/1412252942 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f9074004a30 msgr2=0x7f9074025430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:12.012 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-25T15:30:12.016 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:12.049 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:12.052 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:12.052 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 249.999 bytes/sec: 250 KiB/s
2026-03-25T15:30:12.057 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:12.099 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 100% complete...done.
2026-03-25T15:30:12.102 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-25T15:30:12.124 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-25T15:30:12.132 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:12.131+0000 7f944e117300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:12.140 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-25T15:30:12.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:12.176+0000 7f55e00e1300 -1 librbd::image::CreateRequest: 0x55fe62f47f10 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-25T15:30:12.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:12.176+0000 7f55e00e1300 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-25T15:30:12.183 INFO:tasks.workunit.client.0.vm04.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-25T15:30:12.186 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-25T15:30:12.186 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:12.217 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:12.219 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:12.219 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:12.224 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:12.268 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 100% complete...done.
2026-03-25T15:30:12.272 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-25T15:30:12.292 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-25T15:30:12.300 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:12.298+0000 7f9712366300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:12.307 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-25T15:30:12.360 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-25T15:30:12.396 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:12.399 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:12.399 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:12.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:12.462 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete... Abort image migration: 100% complete...done.
2026-03-25T15:30:12.465 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:12.495 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:12.498 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:12.498 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 499.998 bytes/sec: 500 KiB/s
2026-03-25T15:30:12.502 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:12.556 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 100% complete...done.
2026-03-25T15:30:12.559 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-25T15:30:12.578 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image format 1 is deprecated
2026-03-25T15:30:12.584 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:12.583+0000 7fe97a23b300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:12.592 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-25T15:30:12.645 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort rbd2/test2
2026-03-25T15:30:12.698 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% complete...
Abort image migration: 98% complete...2026-03-25T15:30:12.697+0000 7fb1df175640 0 -- 192.168.123.104:0/376461855 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55927c3a10c0 msgr2=0x55927c42dce0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:30:12.713 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done. 2026-03-25T15:30:12.719 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-25T15:30:12.763 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-25T15:30:12.766 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-25T15:30:12.766 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 249.999 bytes/sec: 250 KiB/s 2026-03-25T15:30:12.774 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2 2026-03-25T15:30:12.832 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... 
Removing image: 93% ... 100% complete...done.
2026-03-25T15:30:12.836 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1
2026-03-25T15:30:12.836 INFO:tasks.workunit.client.0.vm04.stderr:+ continue
2026-03-25T15:30:12.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for format in 1 2
2026-03-25T15:30:12.836 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-25T15:30:12.889 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-25T15:30:12.973 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:12.971+0000 7f30377fe640 0 -- 192.168.123.104:0/3294003600 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f301400ff20 msgr2=0x7f3014010f10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:12.982 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:13.029 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:13.034 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:13.034 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 249.999 bytes/sec: 250 KiB/s
2026-03-25T15:30:13.053 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:13.139 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% ... 17% complete...
Abort image migration: 18% ... 96% complete...
Abort image migration: 98% complete...
2026-03-25T15:30:13.137+0000 7f680161b640 0 -- 192.168.123.104:0/1619332973 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f67d8008d20 msgr2=0x7f67d80291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:30:13.143 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-25T15:30:13.147 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:13.185 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:13.187 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:13.187 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 999.995 bytes/sec: 1000 KiB/s
2026-03-25T15:30:13.194 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:13.264 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% ... 90% complete...
Removing image: 93% ... 96% complete...
2026-03-25T15:30:13.263+0000 7f99bd464640 0 -- 192.168.123.104:0/1015843864 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x559194fdfcc0 msgr2=0x5591950316d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:13.274 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:13.278 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-25T15:30:13.316 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-25T15:30:13.394 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:13.392+0000 7f14bbfff640 0 -- 192.168.123.104:0/2214099354 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f149c026110 msgr2=0x7f149c0464f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:13.401 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:13.442 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:13.446 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:13.446 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 249.999 bytes/sec: 250 KiB/s
2026-03-25T15:30:13.454 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration execute test2
2026-03-25T15:30:13.514 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 3% ... 34% complete...
Image migration: 37% ... 96% complete...
2026-03-25T15:30:13.513+0000 7fadcbfff640 0 -- 192.168.123.104:0/1505414867 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fada8008d20 msgr2=0x7fada80291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:13.526 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:13.525+0000 7fadcbfff640 0 -- 192.168.123.104:0/1505414867 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fadb005f530 msgr2=0x7fadb007f930 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:13.528 INFO:tasks.workunit.client.0.vm04.stderr: Image migration: 100% complete...done.
2026-03-25T15:30:13.532 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:13.626 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% ... 18% complete...
Abort image migration: 20% ... 96% complete...
Abort image migration: 98% complete...
2026-03-25T15:30:13.624+0000 7f666b2a4640 0 -- 192.168.123.104:0/1980586088 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55e6545db0c0 msgr2=0x55e654667b80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:13.631 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-25T15:30:13.634 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:13.681 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:13.683 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:13.683 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:13.692 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:13.780 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% ... 90% complete...
Removing image: 93% ... 96% complete...
2026-03-25T15:30:13.778+0000 7fbc6a87c640 0 -- 192.168.123.104:0/4146779470 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fbc44008d20 msgr2=0x7fbc440291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:30:13.789 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:13.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-25T15:30:13.832 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-25T15:30:13.887 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:13.885+0000 7fe200230300 -1 librbd::image::CreateRequest: 0x55c337ea5f10 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-25T15:30:13.887 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:13.885+0000 7fe200230300 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-25T15:30:13.904 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:13.902+0000 7fe1fece5640 0 -- 192.168.123.104:0/807586830 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fe1e005f600 msgr2=0x7fe1e007fa00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:13.904 INFO:tasks.workunit.client.0.vm04.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-25T15:30:13.908 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-25T15:30:13.908 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:14.076 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:14.079 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:14.079 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:14.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:14.178 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% ... 96% complete...
2026-03-25T15:30:14.177+0000 7f22c98be640 0 -- 192.168.123.104:0/2845569490 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x55d02a71c0f0 msgr2=0x55d02a73c570 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:14.185 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:14.189 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-25T15:30:14.247 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-25T15:30:14.336 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:14.335+0000 7f46d37fe640 0 -- 192.168.123.104:0/2121740262 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f46b4004a30 msgr2=0x7f46b4025430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:14.344 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-25T15:30:14.625 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:14.627 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:14.627 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:14.635 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:14.705 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% ... 31% complete...
Abort image migration: 32% ... 100% complete...done.
2026-03-25T15:30:14.708 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:14.745 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:14.747 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:14.747 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 333.332 bytes/sec: 333 KiB/s
2026-03-25T15:30:14.756 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:14.841 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% ... 96% complete...
2026-03-25T15:30:14.840+0000 7f1116d04640 0 -- 192.168.123.104:0/3443995296 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x555eefc83c30 msgr2=0x555eefc646b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:14.846 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:14.850 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-25T15:30:14.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-25T15:30:15.176 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:15.174+0000 7fc0b25ff640 0 -- 192.168.123.104:0/1925618984 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fc094008d20 msgr2=0x7fc0940291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:15.182 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort rbd2/test2
2026-03-25T15:30:15.263 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% ... 51% complete...
Abort image migration: 53% ... 98% complete...
2026-03-25T15:30:15.261+0000 7fca299b8640 0 -- 192.168.123.104:0/4092231584 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55890813a4e0 msgr2=0x5589081c6b90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:15.265 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 100% complete...done.
2026-03-25T15:30:15.269 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:15.310 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:15.315 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:15.315 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 249.999 bytes/sec: 250 KiB/s
2026-03-25T15:30:15.323 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:15.408 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% ... 96% complete...
2026-03-25T15:30:15.407+0000 7f7d55216640 0 -- 192.168.123.104:0/4092556624 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5567cedfbcc0 msgr2=0x5567cedd74a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:15.414 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:15.417 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 = 1
2026-03-25T15:30:15.417 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-25T15:30:15.460 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration prepare test2 rbd2/ns1/test3
2026-03-25T15:30:15.541 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:15.540+0000 7f60f35ff640 0 -- 192.168.123.104:0/2021257368 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f60d8004a30 msgr2=0x7f60d8024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:30:15.549 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/ns1/test3
2026-03-25T15:30:15.588 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:15.591 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:15.591 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 499.998 bytes/sec: 500 KiB/s
2026-03-25T15:30:15.597 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd migration abort test2
2026-03-25T15:30:15.685 INFO:tasks.workunit.client.0.vm04.stderr: Abort image migration: 1% ... 28% complete...
Abort image migration: 29% ... 100% complete...done.
2026-03-25T15:30:15.689 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-25T15:30:15.730 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-25T15:30:15.732 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:30:15.732 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 0 ops: 1 ops/sec: 499.998 bytes/sec: 500 KiB/s
2026-03-25T15:30:15.740 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:30:15.833 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-25T15:30:15.831+0000 7fe35628c640 0 -- 192.168.123.104:0/3642056439 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x559148a60030 msgr2=0x559148a4fe30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:15.844 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:15.848 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:30:15.848 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:15.931 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.011 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.167 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.236 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.306 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.376 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.533 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.609 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.681 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.762 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.832 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.916 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:16.988 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:17.063 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:17.143 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:17.213 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-25T15:30:17.766 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-25T15:30:17.776 INFO:tasks.workunit.client.0.vm04.stderr:+ test_config
2026-03-25T15:30:17.776 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing config...'
2026-03-25T15:30:17.776 INFO:tasks.workunit.client.0.vm04.stdout:testing config...
2026-03-25T15:30:17.776 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:30:17.776 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:17.877 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:17.968 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.048 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.133 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.224 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.297 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.442 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.517 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.597 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.669 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.749 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.901 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:18.974 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:19.051 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:19.131 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:19.230 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set osd rbd_cache true
2026-03-25T15:30:19.230 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set osd rbd_cache true
2026-03-25T15:30:19.249 INFO:tasks.workunit.client.0.vm04.stderr:rbd: invalid config entity: osd (must be global, client or client.<id>)
2026-03-25T15:30:19.251 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:19.251 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set global debug_ms 10
2026-03-25T15:30:19.251 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global debug_ms 10
2026-03-25T15:30:19.270 INFO:tasks.workunit.client.0.vm04.stderr:rbd: not rbd option: debug_ms
2026-03-25T15:30:19.272 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:19.272 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set global rbd_UNKNOWN false
2026-03-25T15:30:19.272 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global rbd_UNKNOWN false
2026-03-25T15:30:19.299 INFO:tasks.workunit.client.0.vm04.stderr:rbd: invalid config key: rbd_UNKNOWN
2026-03-25T15:30:19.300 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:19.300 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global set global rbd_cache INVALID
2026-03-25T15:30:19.300 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global rbd_cache INVALID
2026-03-25T15:30:19.327 INFO:tasks.workunit.client.0.vm04.stderr:rbd: error setting rbd_cache: error parsing value: Expected option value to be integer, got 'INVALID'
2026-03-25T15:30:19.330 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:19.330 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set global rbd_cache false
2026-03-25T15:30:19.366 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set client rbd_cache true
2026-03-25T15:30:19.482 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global set client.123 rbd_cache false
2026-03-25T15:30:19.516 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get global rbd_cache
2026-03-25T15:30:19.516 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^false$'
2026-03-25T15:30:19.543 INFO:tasks.workunit.client.0.vm04.stdout:false
2026-03-25T15:30:19.544 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client rbd_cache
2026-03-25T15:30:19.544 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^true$'
2026-03-25T15:30:19.571 INFO:tasks.workunit.client.0.vm04.stdout:true
2026-03-25T15:30:19.572 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client.123 rbd_cache
2026-03-25T15:30:19.572 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^false$'
2026-03-25T15:30:19.599 INFO:tasks.workunit.client.0.vm04.stdout:false
2026-03-25T15:30:19.600 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global get client.UNKNOWN rbd_cache
2026-03-25T15:30:19.600 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client.UNKNOWN rbd_cache
2026-03-25T15:30:19.828 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-25T15:30:19.831 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:19.831 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list global
2026-03-25T15:30:19.832 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * global *$'
2026-03-25T15:30:19.861 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false global
2026-03-25T15:30:19.862 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client
2026-03-25T15:30:19.862 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * client *$'
2026-03-25T15:30:19.893 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true client
2026-03-25T15:30:19.893 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client.123
2026-03-25T15:30:19.893 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * client.123 *$'
2026-03-25T15:30:19.928 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false client.123
2026-03-25T15:30:19.928 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client.UNKNOWN
2026-03-25T15:30:19.928 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * client *$'
2026-03-25T15:30:19.959 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true client
2026-03-25T15:30:19.959 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global rm client rbd_cache
2026-03-25T15:30:19.992 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config global get client rbd_cache
2026-03-25T15:30:19.993 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global get client rbd_cache
2026-03-25T15:30:20.020 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-25T15:30:20.022 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:20.023 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global list client
2026-03-25T15:30:20.023 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * global *$'
2026-03-25T15:30:20.051 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false global
2026-03-25T15:30:20.051 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global rm client.123 rbd_cache
2026-03-25T15:30:20.084 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config global rm global rbd_cache
2026-03-25T15:30:20.120 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool set rbd rbd_cache true
2026-03-25T15:30:20.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool list rbd
2026-03-25T15:30:20.163 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-25T15:30:20.194 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true pool
2026-03-25T15:30:20.194 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool get rbd rbd_cache
2026-03-25T15:30:20.194 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^true$'
2026-03-25T15:30:20.226 INFO:tasks.workunit.client.0.vm04.stdout:true
2026-03-25T15:30:20.226 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 1 test1
2026-03-25T15:30:20.255 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:20.254+0000 7fb1b495a300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:20.263 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image list rbd/test1
2026-03-25T15:30:20.263 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-25T15:30:20.303 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true pool
2026-03-25T15:30:20.303 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image set rbd/test1 rbd_cache false
2026-03-25T15:30:20.353 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image list rbd/test1
2026-03-25T15:30:20.353 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * false * image *$'
2026-03-25T15:30:20.391 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache false image
2026-03-25T15:30:20.392 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image get rbd/test1 rbd_cache
2026-03-25T15:30:20.392 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^false$'
2026-03-25T15:30:20.428 INFO:tasks.workunit.client.0.vm04.stdout:false
2026-03-25T15:30:20.429 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image remove rbd/test1 rbd_cache
2026-03-25T15:30:20.469 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config image get rbd/test1 rbd_cache
2026-03-25T15:30:20.469 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image get rbd/test1 rbd_cache
2026-03-25T15:30:20.500 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-25T15:30:20.506 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:20.506 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config image list rbd/test1
2026-03-25T15:30:20.506 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-25T15:30:20.543 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true pool
2026-03-25T15:30:20.545 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool remove rbd rbd_cache
2026-03-25T15:30:20.577 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd config pool get rbd rbd_cache
2026-03-25T15:30:20.577 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool get rbd rbd_cache
2026-03-25T15:30:20.605 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd_cache is not set
2026-03-25T15:30:20.608 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:30:20.608 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd config pool list rbd
2026-03-25T15:30:20.608 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^rbd_cache * true * config *$'
2026-03-25T15:30:20.638 INFO:tasks.workunit.client.0.vm04.stdout:rbd_cache true config
2026-03-25T15:30:20.638 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:30:20.673 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stderr:+ RBD_CREATE_ARGS=
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stderr:+ test_others
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing import, export, resize, and snapshots...'
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stdout:testing import, export, resize, and snapshots...
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1'
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:30:20.677 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:20.763 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:20.860 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:20.935 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.013 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.088 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.163 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.231 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.300 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.375 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.450 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.524 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.599 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.674 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.817 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.897 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:21.984 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:22.062 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-25T15:30:22.063 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10
2026-03-25T15:30:22.064 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-25T15:30:22.064 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-25T15:30:22.064 INFO:tasks.workunit.client.0.vm04.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 4.5937e-05 s, 22.3 MB/s
2026-03-25T15:30:22.064 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100
2026-03-25T15:30:22.065 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records in
2026-03-25T15:30:22.065 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records out
2026-03-25T15:30:22.065 INFO:tasks.workunit.client.0.vm04.stderr:10240 bytes (10 kB, 10 KiB) copied, 4.9793e-05 s, 206 MB/s
2026-03-25T15:30:22.065 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000
2026-03-25T15:30:22.066 INFO:tasks.workunit.client.0.vm04.stderr:59+1 records in
2026-03-25T15:30:22.066 INFO:tasks.workunit.client.0.vm04.stderr:59+1 records out
2026-03-25T15:30:22.066 INFO:tasks.workunit.client.0.vm04.stderr:61432 bytes (61 kB, 60 KiB) copied, 0.000136825 s, 449 MB/s
2026-03-25T15:30:22.066 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000
2026-03-25T15:30:22.067 INFO:tasks.workunit.client.0.vm04.stderr:137+1 records in
2026-03-25T15:30:22.067 INFO:tasks.workunit.client.0.vm04.stderr:137+1 records out
2026-03-25T15:30:22.067 INFO:tasks.workunit.client.0.vm04.stderr:140744 bytes (141 kB, 137 KiB) copied, 0.000285586 s, 493 MB/s
2026-03-25T15:30:22.067 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000
2026-03-25T15:30:22.068 INFO:tasks.workunit.client.0.vm04.stderr:55+1 records in
2026-03-25T15:30:22.068 INFO:tasks.workunit.client.0.vm04.stderr:55+1 records out
2026-03-25T15:30:22.068 INFO:tasks.workunit.client.0.vm04.stderr:57280 bytes (57 kB, 56 KiB) copied, 0.000136626 s, 419 MB/s
2026-03-25T15:30:22.068 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import /tmp/img1 testimg1
2026-03-25T15:30:22.094 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:22.093+0000 7f6d4c637300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:22.355 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 4% complete... Importing image: 8% complete... Importing image: 12% complete... Importing image: 16% complete... Importing image: 20% complete... Importing image: 24% complete... Importing image: 28% complete... Importing image: 32% complete... Importing image: 36% complete... Importing image: 40% complete... Importing image: 45% complete... Importing image: 49% complete... Importing image: 53% complete... Importing image: 57% complete... Importing image: 61% complete... Importing image: 65% complete... Importing image: 69% complete... Importing image: 73% complete... Importing image: 77% complete... Importing image: 81% complete... Importing image: 85% complete... Importing image: 90% complete... Importing image: 94% complete... Importing image: 98% complete... Importing image: 100% complete...done.
2026-03-25T15:30:22.359 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=256 --allow-shrink
2026-03-25T15:30:22.411 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 100% complete...done.
2026-03-25T15:30:22.416 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img2
2026-03-25T15:30:22.543 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-25T15:30:22.549 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-25T15:30:22.970 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:30:22.975 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128
2026-03-25T15:30:23.009 INFO:tasks.workunit.client.0.vm04.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag
2026-03-25T15:30:23.014 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-25T15:30:23.014 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128 --allow-shrink
2026-03-25T15:30:23.060 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 50% complete... Resizing image: 51% complete... Resizing image: 53% complete... Resizing image: 54% complete... Resizing image: 56% complete... Resizing image: 57% complete... Resizing image: 59% complete... Resizing image: 60% complete... Resizing image: 62% complete... Resizing image: 64% complete... Resizing image: 65% complete... Resizing image: 67% complete... Resizing image: 68% complete... Resizing image: 70% complete... Resizing image: 71% complete... Resizing image: 73% complete... Resizing image: 75% complete... Resizing image: 76% complete... Resizing image: 78% complete... Resizing image: 79% complete... Resizing image: 81% complete... Resizing image: 82% complete... Resizing image: 84% complete... Resizing image: 85% complete... Resizing image: 87% complete... Resizing image: 89% complete... Resizing image: 90% complete... Resizing image: 92% complete... Resizing image: 93% complete... Resizing image: 95% complete... Resizing image: 96% complete... Resizing image: 98% complete... Resizing image: 100% complete...done.
2026-03-25T15:30:23.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img3
2026-03-25T15:30:23.148 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-25T15:30:23.155 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-25T15:30:23.155 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:30:23.186 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-25T15:30:23.187 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-25T15:30:23.187 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:23.216 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:23.217 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-25T15:30:23.217 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-25T15:30:23.252 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... Exporting image: 37% complete... Exporting image: 100% complete...done.
2026-03-25T15:30:23.256 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-25T15:30:23.382 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:23.386 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --size=1 testimg-diff1
2026-03-25T15:30:23.415 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:23.413+0000 7f6b924dc300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:23.430 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-25T15:30:23.993 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 22% complete... Importing image diff: 63% complete... Importing image diff: 99% complete... Importing image diff: 100% complete...done.
2026-03-25T15:30:23.998 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-25T15:30:24.047 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 68% complete... Importing image diff: 96% complete... Importing image diff: 100% complete...done.
2026-03-25T15:30:24.052 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-25T15:30:24.052 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:30:24.083 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-25T15:30:24.084 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-25T15:30:24.084 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:24.123 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:24.123 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1
2026-03-25T15:30:24.123 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:30:24.156 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-25T15:30:24.156 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-25T15:30:24.156 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:24.188 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:24.188 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-25T15:30:24.255 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 1% complete... Image copy: 3% complete... Image copy: 4% complete... Image copy: 6% complete... Image copy: 7% complete... Image copy: 9% complete... Image copy: 10% complete... Image copy: 12% complete... Image copy: 14% complete... Image copy: 15% complete... Image copy: 17% complete... Image copy: 18% complete... Image copy: 20% complete... Image copy: 21% complete... Image copy: 23% complete... Image copy: 25% complete... Image copy: 26% complete... Image copy: 28% complete... Image copy: 29% complete... Image copy: 31% complete... Image copy: 32% complete... Image copy: 34% complete... Image copy: 35% complete... Image copy: 37% complete... Image copy: 39% complete... Image copy: 40% complete... Image copy: 42% complete... Image copy: 43% complete... Image copy: 45% complete... Image copy: 46% complete... Image copy: 48% complete... Image copy: 50% complete... Image copy: 51% complete... Image copy: 53% complete... Image copy: 54% complete... Image copy: 56% complete... Image copy: 57% complete... Image copy: 59% complete... Image copy: 60% complete... Image copy: 62% complete... Image copy: 64% complete... Image copy: 65% complete... Image copy: 67% complete... Image copy: 68% complete... Image copy: 70% complete... Image copy: 71% complete... Image copy: 73% complete... Image copy: 75% complete... Image copy: 76% complete... Image copy: 78% complete... Image copy: 79% complete... Image copy: 81% complete... Image copy: 82% complete... Image copy: 84% complete... Image copy: 85% complete... Image copy: 87% complete... Image copy: 89% complete... Image copy: 90% complete... Image copy: 92% complete... Image copy: 93% complete... Image copy: 95% complete... Image copy: 96% complete... Image copy: 98% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-25T15:30:24.260 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 testimg3
2026-03-25T15:30:24.326 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... Image copy: 6% complete... Image copy: 9% complete... Image copy: 12% complete... Image copy: 15% complete... Image copy: 18% complete... Image copy: 21% complete... Image copy: 25% complete... Image copy: 28% complete... Image copy: 31% complete... Image copy: 34% complete... Image copy: 37% complete... Image copy: 40% complete... Image copy: 43% complete... Image copy: 46% complete... Image copy: 50% complete... Image copy: 53% complete... Image copy: 56% complete... Image copy: 59% complete... Image copy: 62% complete... Image copy: 65% complete... Image copy: 68% complete... Image copy: 71% complete... Image copy: 75% complete... Image copy: 78% complete... Image copy: 81% complete... Image copy: 84% complete... Image copy: 87% complete... Image copy: 90% complete... Image copy: 93% complete... Image copy: 96% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-25T15:30:24.331 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-25T15:30:24.601 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 1% complete... Image copy: 3% complete... Image copy: 4% complete... Image copy: 6% complete... Image copy: 7% complete... Image copy: 9% complete... Image copy: 10% complete... Image copy: 12% complete... Image copy: 14% complete... Image copy: 15% complete... Image copy: 17% complete... Image copy: 18% complete... Image copy: 20% complete... Image copy: 21% complete... Image copy: 23% complete... Image copy: 25% complete... Image copy: 26% complete... Image copy: 28% complete... Image copy: 29% complete... Image copy: 31% complete... Image copy: 32% complete... Image copy: 34% complete... Image copy: 35% complete... Image copy: 37% complete... Image copy: 39% complete... Image copy: 40% complete... Image copy: 42% complete... Image copy: 43% complete... Image copy: 45% complete... Image copy: 46% complete... Image copy: 48% complete... Image copy: 50% complete... Image copy: 51% complete... Image copy: 53% complete... Image copy: 54% complete... Image copy: 56% complete... Image copy: 57% complete... Image copy: 59% complete... Image copy: 60% complete... Image copy: 62% complete... Image copy: 64% complete... Image copy: 65% complete... Image copy: 67% complete... Image copy: 68% complete... Image copy: 70% complete... Image copy: 71% complete... Image copy: 73% complete... Image copy: 75% complete... Image copy: 76% complete... Image copy: 78% complete... Image copy: 79% complete... Image copy: 81% complete... Image copy: 82% complete... Image copy: 84% complete... Image copy: 85% complete... Image copy: 87% complete... Image copy: 89% complete... Image copy: 90% complete... Image copy: 92% complete... Image copy: 93% complete... Image copy: 95% complete... Image copy: 96% complete... Image copy: 98% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-25T15:30:24.606 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-25T15:30:24.722 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... Image copy: 6% complete... Image copy: 9% complete... Image copy: 12% complete... Image copy: 15% complete... Image copy: 18% complete... Image copy: 21% complete... Image copy: 25% complete... Image copy: 28% complete... Image copy: 31% complete... Image copy: 34% complete... Image copy: 37% complete... Image copy: 40% complete... Image copy: 43% complete... Image copy: 46% complete... Image copy: 50% complete... Image copy: 53% complete... Image copy: 56% complete... Image copy: 59% complete... Image copy: 62% complete... Image copy: 65% complete... Image copy: 68% complete... Image copy: 71% complete... Image copy: 75% complete... Image copy: 78% complete... Image copy: 81% complete... Image copy: 84% complete... Image copy: 87% complete... Image copy: 90% complete... Image copy: 93% complete... Image copy: 96% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-25T15:30:24.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-25T15:30:24.729 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:24.762 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:24.763 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:30:24.763 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:30:24.796 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-25T15:30:24.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff2
2026-03-25T15:30:24.796 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:24.827 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:24.828 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff3
2026-03-25T15:30:24.828 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:30:24.861 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-25T15:30:24.861 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 testimg4
2026-03-25T15:30:25.018 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-25T15:30:25.022 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-25T15:30:26.013 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-25T15:30:26.017 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg4
2026-03-25T15:30:26.017 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:30:26.049 INFO:tasks.workunit.client.0.vm04.stdout: size 128 MiB in 32 objects
2026-03-25T15:30:26.050 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg5
2026-03-25T15:30:26.050 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:26.082 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:26.082 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-25T15:30:26.082 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:30:26.082 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:30:26.082 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:30:26.114 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:30:26.115 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-25T15:30:26.115 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap1.*'
2026-03-25T15:30:26.147 INFO:tasks.workunit.client.0.vm04.stdout: 11 snap1 256 MiB Wed Mar 25 15:30:25 2026
2026-03-25T15:30:26.147 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-25T15:30:26.208 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:26.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-25T15:30:26.320 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:26.326 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-25T15:30:26.378 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:26.384 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-25T15:30:26.440 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:26.445 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-25T15:30:26.573 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:26.579 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new
2026-03-25T15:30:26.658 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:26.663 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img2.new
2026-03-25T15:30:27.729 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img3.new
2026-03-25T15:30:28.124 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new
2026-03-25T15:30:28.884 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new
2026-03-25T15:30:29.168 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg1
2026-03-25T15:30:29.272 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-25T15:30:29.278 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-25T15:30:29.362 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-25T15:30:29.368 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-25T15:30:29.368 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:29.399 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:29.402 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1
2026-03-25T15:30:29.402 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:30:29.431 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:30:29.431 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.snap1
2026-03-25T15:30:29.564 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:29.568 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1
2026-03-25T15:30:29.726 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:30:29.730 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img1.snap1
2026-03-25T15:30:30.178 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1
2026-03-25T15:30:30.418 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg2
2026-03-25T15:30:30.495 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 90% complete...2026-03-25T15:30:30.493+0000 7ff840f23640 0 -- 192.168.123.104:0/1229311580 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558601a0c9b0 msgr2=0x558601b14f10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:30.502 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 98% complete...2026-03-25T15:30:30.501+0000 7ff840f23640 0 -- 192.168.123.104:0/1229311580 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7ff82005ee50 msgr2=0x7ff82007f230 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:30:30.506 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:30.510 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg3
2026-03-25T15:30:30.587 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:30.590 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create testimg2 -s 0
2026-03-25T15:30:30.617 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:30.615+0000 7f5e3eedc300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:30.624 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd cp testimg2 testimg3
2026-03-25T15:30:30.662 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done.
2026-03-25T15:30:30.665 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep cp testimg2 testimg6
2026-03-25T15:30:30.707 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-25T15:30:30.710 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg1
2026-03-25T15:30:31.026 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:30:31.031 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg-diff1
2026-03-25T15:30:32.029 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:30:32.033 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-25T15:30:32.033 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-25T15:30:32.065 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory
2026-03-25T15:30:32.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-25T15:30:32.065 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-25T15:30:32.094 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory
2026-03-25T15:30:32.094 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd sparsify testimg1
2026-03-25T15:30:32.154 INFO:tasks.workunit.client.0.vm04.stderr: Image sparsify: 100% complete...done.
2026-03-25T15:30:32.159 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:30:32.159 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:32.254 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:32.324 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:32.424 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:33.110 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.139 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.260 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.352 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.494 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.603 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.670 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.740 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.813 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.886 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:34.959 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.031 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.103 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.177 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.248 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-25T15:30:35.376 INFO:tasks.workunit.client.0.vm04.stderr:+ test_locking
2026-03-25T15:30:35.376 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing locking...'
2026-03-25T15:30:35.376 INFO:tasks.workunit.client.0.vm04.stdout:testing locking...
2026-03-25T15:30:35.376 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:30:35.376 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.449 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.514 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.581 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.649 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.721 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.795 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.864 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:35.936 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.014 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.093 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.170 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.245 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.330 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.413 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.492 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.574 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.649 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:36.729 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create -s 1 test1
2026-03-25T15:30:36.754 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:36.752+0000 7faa72c0d300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:36.762 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:36.762 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:30:36.762 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:30:36.793 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:30:36.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id
2026-03-25T15:30:36.828 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:36.829 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 '
2026-03-25T15:30:36.857 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 exclusive lock on this image.
2026-03-25T15:30:36.857 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1
2026-03-25T15:30:36.857 INFO:tasks.workunit.client.0.vm04.stderr:++ tail -n 1
2026-03-25T15:30:36.858 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{print $1;}'
2026-03-25T15:30:36.890 INFO:tasks.workunit.client.0.vm04.stderr:+ LOCKER=client.7670
2026-03-25T15:30:36.890 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock remove test1 id client.7670
2026-03-25T15:30:37.707 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:37.707 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:30:37.707 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:30:37.736 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:30:37.736 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag
2026-03-25T15:30:37.774 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:37.774 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 '
2026-03-25T15:30:37.802 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 shared lock on this image.
2026-03-25T15:30:37.802 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag
2026-03-25T15:30:37.838 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:37.838 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 2 '
2026-03-25T15:30:37.867 INFO:tasks.workunit.client.0.vm04.stdout:There are 2 shared locks on this image.
2026-03-25T15:30:37.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id2 --shared tag
2026-03-25T15:30:37.899 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:37.899 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 3 '
2026-03-25T15:30:37.930 INFO:tasks.workunit.client.0.vm04.stdout:There are 3 shared locks on this image.
2026-03-25T15:30:37.930 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1
2026-03-25T15:30:37.930 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1
2026-03-25T15:30:37.930 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}'
2026-03-25T15:30:37.930 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1
2026-03-25T15:30:38.713 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1
2026-03-25T15:30:38.713 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qE 'features:.*exclusive'
2026-03-25T15:30:38.743 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:30:38.782 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:38.785 INFO:tasks.workunit.client.0.vm04.stderr:+ test_thick_provision
2026-03-25T15:30:38.785 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing thick provision...'
2026-03-25T15:30:38.785 INFO:tasks.workunit.client.0.vm04.stdout:testing thick provision...
2026-03-25T15:30:38.786 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:30:38.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:38.865 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:38.939 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.023 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.097 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.168 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.234 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.301 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.367 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.446 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.514 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.586 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.657 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.732 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.806 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.887 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:39.960 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:40.030 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:30:40.106 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --thick-provision -s 64M test1
2026-03-25T15:30:40.131 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:40.130+0000 7f395dbff300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:40.397 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 100% complete...done.
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']'
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' '
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^64 MiB'
2026-03-25T15:30:40.407 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5
2026-03-25T15:30:40.435 INFO:tasks.workunit.client.0.vm04.stderr:warning: fast-diff map is not enabled for test1. operation may be slow.
2026-03-25T15:30:40.443 INFO:tasks.workunit.client.0.vm04.stdout:64 MiB
2026-03-25T15:30:40.443 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0
2026-03-25T15:30:40.443 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']'
2026-03-25T15:30:40.443 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:30:40.443 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:30:40.475 INFO:tasks.workunit.client.0.vm04.stderr:warning: fast-diff map is not enabled for test1. operation may be slow.
2026-03-25T15:30:40.478 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED
2026-03-25T15:30:40.478 INFO:tasks.workunit.client.0.vm04.stdout:test1 64 MiB 64 MiB
2026-03-25T15:30:40.482 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']'
2026-03-25T15:30:40.482 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:30:40.539 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:40.543 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:30:40.543 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:30:40.543 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:30:40.543 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:30:40.571 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:30:40.571 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --thick-provision -s 4G test1
2026-03-25T15:30:40.601 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:40.599+0000 7fa168a57300 -1 librbd: Forced V1 image creation.
2026-03-25T15:30:42.033 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 10% complete...2026-03-25T15:30:42.031+0000 7fa16750c640 0 -- 192.168.123.104:0/2609957404 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fa14000f110 msgr2=0x7fa14002f550 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:42.371 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 12% complete...2026-03-25T15:30:42.369+0000 7fa16750c640 0 -- 192.168.123.104:0/2609957404 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fa14805efc0 msgr2=0x7fa14807f3a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:57.824 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 100% complete...done.
2026-03-25T15:30:57.830 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0
2026-03-25T15:30:57.830 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=
2026-03-25T15:30:57.830 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']'
2026-03-25T15:30:57.830 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:30:57.830 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:30:57.830 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' '
2026-03-25T15:30:57.831 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^4 GiB'
2026-03-25T15:30:57.831 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5
2026-03-25T15:30:57.857 INFO:tasks.workunit.client.0.vm04.stderr:warning: fast-diff map is not enabled for test1. operation may be slow.
2026-03-25T15:30:57.867 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:57.866+0000 7f43bffff640 0 -- 192.168.123.104:0/2176587882 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x560a20d8f170 msgr2=0x560a20daf620 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:30:57.869 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:57.868+0000 7f43c5ce4640 0 -- 192.168.123.104:0/2176587882 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x560a20cdbca0 msgr2=0x560a20d2b3f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:57.940 INFO:tasks.workunit.client.0.vm04.stdout:4 GiB
2026-03-25T15:30:57.940 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0
2026-03-25T15:30:57.941 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']'
2026-03-25T15:30:57.941 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:30:57.941 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:30:57.985 INFO:tasks.workunit.client.0.vm04.stderr:warning: fast-diff map is not enabled for test1. operation may be slow.
2026-03-25T15:30:57.991 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:57.990+0000 7fbee52e4640 0 -- 192.168.123.104:0/3113155813 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55baa94d3ca0 msgr2=0x55baa94cc9c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:57.993 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:30:57.992+0000 7fbee52e4640 0 -- 192.168.123.104:0/3113155813 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fbec405edf0 msgr2=0x7fbec407f1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:58.051 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED
2026-03-25T15:30:58.051 INFO:tasks.workunit.client.0.vm04.stdout:test1 4 GiB 4 GiB
2026-03-25T15:30:58.055 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']'
2026-03-25T15:30:58.055 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:30:58.164 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 10% complete...2026-03-25T15:30:58.163+0000 7f22ce4c7640 0 -- 192.168.123.104:0/614234694 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e913bd3080 msgr2=0x55e913bc3940 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:58.180 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 12% complete...2026-03-25T15:30:58.178+0000 7f22ce4c7640 0 -- 192.168.123.104:0/614234694 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f22ac05eef0 msgr2=0x7f22ac07f2f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:30:58.884 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:30:58.888 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:30:58.888 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:30:58.888 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:30:58.889 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ RBD_CREATE_ARGS='--image-format 2'
2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ test_others
2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stdout:testing import, export, resize, and snapshots...
2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing import, export, resize, and snapshots...'
2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1' 2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:30:58.914 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:58.991 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.071 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.149 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.226 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.302 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.376 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.526 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.602 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.680 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.754 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.896 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:30:59.973 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:00.046 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:00.117 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:00.189 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:00.262 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new 
/tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-25T15:31:00.262 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10
2026-03-25T15:31:00.263 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records in
2026-03-25T15:31:00.263 INFO:tasks.workunit.client.0.vm04.stderr:1+0 records out
2026-03-25T15:31:00.263 INFO:tasks.workunit.client.0.vm04.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 6.406e-05 s, 16.0 MB/s
2026-03-25T15:31:00.263 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100
2026-03-25T15:31:00.264 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records in
2026-03-25T15:31:00.264 INFO:tasks.workunit.client.0.vm04.stderr:10+0 records out
2026-03-25T15:31:00.264 INFO:tasks.workunit.client.0.vm04.stderr:10240 bytes (10 kB, 10 KiB) copied, 6.5352e-05 s, 157 MB/s
2026-03-25T15:31:00.264 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000
2026-03-25T15:31:00.265 INFO:tasks.workunit.client.0.vm04.stderr:59+1 records in
2026-03-25T15:31:00.265 INFO:tasks.workunit.client.0.vm04.stderr:59+1 records out
2026-03-25T15:31:00.265 INFO:tasks.workunit.client.0.vm04.stderr:61432 bytes (61 kB, 60 KiB) copied, 0.000139952 s, 439 MB/s
2026-03-25T15:31:00.265 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000
2026-03-25T15:31:00.266 INFO:tasks.workunit.client.0.vm04.stderr:137+1 records in
2026-03-25T15:31:00.266 INFO:tasks.workunit.client.0.vm04.stderr:137+1 records out
2026-03-25T15:31:00.266 INFO:tasks.workunit.client.0.vm04.stderr:140744 bytes (141 kB, 137 KiB) copied, 0.00028827 s, 488 MB/s
2026-03-25T15:31:00.266 INFO:tasks.workunit.client.0.vm04.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000
2026-03-25T15:31:00.267 INFO:tasks.workunit.client.0.vm04.stderr:55+1 records in
2026-03-25T15:31:00.267 INFO:tasks.workunit.client.0.vm04.stderr:55+1 records out
2026-03-25T15:31:00.267 INFO:tasks.workunit.client.0.vm04.stderr:57280 bytes (57 kB, 56 KiB) copied, 0.000141474 s, 405 MB/s
2026-03-25T15:31:00.267 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import --image-format 2 /tmp/img1 testimg1
2026-03-25T15:31:00.358 INFO:tasks.workunit.client.0.vm04.stderr: Importing image: 4% complete... ... Importing image: 100% complete...done.
2026-03-25T15:31:00.362 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=256 --allow-shrink
2026-03-25T15:31:00.399 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 100% complete...done.
2026-03-25T15:31:00.406 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img2
2026-03-25T15:31:00.470 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:00.476 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-25T15:31:00.958 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
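The dd sequence above builds the sparse source image that gets imported: each dd writes at byte offset seek*bs (standard dd semantics), leaving holes elsewhere, so /tmp/img1 is mostly unallocated. A minimal sketch of the arithmetic, checking the final file size implied by the last write (57280 bytes of /bin/ln at seek=100000, bs=1k) before `rbd resize --size=256` rounds the image up:

```shell
# Offsets in the sparse test image: dd writes `count` blocks of `bs` bytes
# starting at block `seek`; everything in between stays a hole.
bs=1024
echo $((100000 * bs))           # byte offset of the final dd write: 102400000
echo $((100000 * bs + 57280))   # resulting file size: 102457280 (~98 MiB)
```

The file is therefore just under 128 MiB before the resize, which is why the shrink back to 128 MiB later in the run still covers all written data.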
2026-03-25T15:31:00.969 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128
2026-03-25T15:31:01.003 INFO:tasks.workunit.client.0.vm04.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag
2026-03-25T15:31:01.008 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-25T15:31:01.009 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd resize testimg1 --size=128 --allow-shrink
2026-03-25T15:31:01.051 INFO:tasks.workunit.client.0.vm04.stderr: Resizing image: 50% complete... ... Resizing image: 100% complete...done.
2026-03-25T15:31:01.059 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img3
2026-03-25T15:31:01.112 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:01.118 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-25T15:31:01.118 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:31:01.153 INFO:tasks.workunit.client.0.vm04.stdout:	size 128 MiB in 32 objects
2026-03-25T15:31:01.153 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-25T15:31:01.154 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:31:01.188 INFO:tasks.workunit.client.0.vm04.stdout:	size 256 MiB in 64 objects
2026-03-25T15:31:01.188 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-25T15:31:01.189 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-25T15:31:01.227 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... Exporting image: 37% complete... Exporting image: 100% complete...done.
2026-03-25T15:31:01.231 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-25T15:31:01.266 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
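The `size 128 MiB in 32 objects` / `size 256 MiB in 64 objects` lines the greps match follow directly from RBD's default object size of 4 MiB (object order 22, assuming no `--object-size` override, which none of the traced commands pass): object count = image size / object size. A quick check of that arithmetic:

```shell
# RBD stripes an image over fixed-size backing objects; with the default
# 4 MiB object size, `rbd info` reports size/4MiB objects.
obj_size=$((4 * 1024 * 1024))
echo $((128 * 1024 * 1024 / obj_size))   # 32 objects for the 128 MiB head
echo $((256 * 1024 * 1024 / obj_size))   # 64 objects for the 256 MiB snap1 view
```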
2026-03-25T15:31:01.270 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size=1 testimg-diff1
2026-03-25T15:31:01.310 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-25T15:31:01.984 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 22% complete... Importing image diff: 63% complete... Importing image diff: 99% complete... Importing image diff: 100% complete...done.
2026-03-25T15:31:01.994 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-25T15:31:02.037 INFO:tasks.workunit.client.0.vm04.stderr: Importing image diff: 68% complete... Importing image diff: 96% complete... Importing image diff: 100% complete...done.
2026-03-25T15:31:02.044 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1
2026-03-25T15:31:02.044 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:31:02.077 INFO:tasks.workunit.client.0.vm04.stdout:	size 128 MiB in 32 objects
2026-03-25T15:31:02.078 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1
2026-03-25T15:31:02.078 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:31:02.112 INFO:tasks.workunit.client.0.vm04.stdout:	size 256 MiB in 64 objects
2026-03-25T15:31:02.113 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1
2026-03-25T15:31:02.113 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:31:02.145 INFO:tasks.workunit.client.0.vm04.stdout:	size 128 MiB in 32 objects
2026-03-25T15:31:02.145 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-25T15:31:02.145 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:31:02.180 INFO:tasks.workunit.client.0.vm04.stdout:	size 256 MiB in 64 objects
2026-03-25T15:31:02.180 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-25T15:31:02.241 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... Image copy: 37% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-25T15:31:02.246 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg1 testimg3
2026-03-25T15:31:02.316 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... ... Image copy: 100% complete...done.
2026-03-25T15:31:02.322 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-25T15:31:02.399 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... Image copy: 37% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-25T15:31:02.404 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-25T15:31:02.480 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 3% complete... ... Image copy: 100% complete...done.
2026-03-25T15:31:02.485 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-25T15:31:02.486 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:31:02.520 INFO:tasks.workunit.client.0.vm04.stdout:	size 256 MiB in 64 objects
2026-03-25T15:31:02.521 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:31:02.521 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:31:02.554 INFO:tasks.workunit.client.0.vm04.stdout:	size 128 MiB in 32 objects
2026-03-25T15:31:02.555 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff2
2026-03-25T15:31:02.555 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:31:02.588 INFO:tasks.workunit.client.0.vm04.stdout:	size 256 MiB in 64 objects
2026-03-25T15:31:02.588 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff3
2026-03-25T15:31:02.588 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:31:02.621 INFO:tasks.workunit.client.0.vm04.stdout:	size 128 MiB in 32 objects
2026-03-25T15:31:02.622 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 testimg4
2026-03-25T15:31:03.054 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... ... Image deep copy: 100% complete...done.
2026-03-25T15:31:03.058 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-25T15:31:04.019 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... ... Image deep copy: 100% complete...done.
2026-03-25T15:31:04.023 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg4
2026-03-25T15:31:04.023 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 128 MiB'
2026-03-25T15:31:04.057 INFO:tasks.workunit.client.0.vm04.stdout:	size 128 MiB in 32 objects
2026-03-25T15:31:04.058 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg5
2026-03-25T15:31:04.058 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:31:04.090 INFO:tasks.workunit.client.0.vm04.stdout:	size 256 MiB in 64 objects
2026-03-25T15:31:04.090 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-25T15:31:04.090 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:31:04.090 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:04.090 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:04.123 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:04.123 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg4
2026-03-25T15:31:04.123 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap1.*'
2026-03-25T15:31:04.157 INFO:tasks.workunit.client.0.vm04.stdout: 15 snap1 256 MiB Wed Mar 25 15:31:02 2026
2026-03-25T15:31:04.157 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-25T15:31:04.213 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:04.218 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-25T15:31:04.290 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:04.296 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-25T15:31:04.355 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:04.363 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-25T15:31:04.429 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:04.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-25T15:31:04.516 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... ... Exporting image: 100% complete...done.
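The snapshot-count check traced above pipes `rbd snap ls` through `grep -v SNAPID | wc -l` to strip the header row and count the remaining snapshot lines, confirming the deep copy carried exactly one snapshot across. With no cluster at hand, the listing is stood in for below with printf, using the testimg4 row the log actually shows:

```shell
# Drop the SNAPID header line, count what remains: one snapshot (snap1).
printf 'SNAPID  NAME   SIZE     TIMESTAMP\n15      snap1  256 MiB  Wed Mar 25 15:31:02 2026\n' \
  | grep -v SNAPID | wc -l   # prints 1
```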
2026-03-25T15:31:04.523 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new
2026-03-25T15:31:04.586 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 3% complete... ... Exporting image: 100% complete...done.
2026-03-25T15:31:04.592 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img2.new
2026-03-25T15:31:04.757 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img3.new
2026-03-25T15:31:04.835 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new
2026-03-25T15:31:04.960 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new
2026-03-25T15:31:05.020 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg1
2026-03-25T15:31:05.143 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 1% complete... ... Rolling back to snapshot: 100% complete...done.
2026-03-25T15:31:05.149 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-25T15:31:05.204 INFO:tasks.workunit.client.0.vm04.stderr: Rolling back to snapshot: 1% complete... ... Rolling back to snapshot: 100% complete...done.
2026-03-25T15:31:05.210 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg1 2026-03-25T15:31:05.210 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB' 2026-03-25T15:31:05.241 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects 2026-03-25T15:31:05.241 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg-diff1 2026-03-25T15:31:05.241 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB' 2026-03-25T15:31:05.271 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects 2026-03-25T15:31:05.271 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg1 /tmp/img1.snap1 2026-03-25T15:31:05.330 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 100% complete...done. 2026-03-25T15:31:05.336 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1 2026-03-25T15:31:05.400 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 1% complete... Exporting image: 100% complete...done. 2026-03-25T15:31:05.404 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img1.snap1 2026-03-25T15:31:05.530 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1 2026-03-25T15:31:05.971 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg2 2026-03-25T15:31:06.057 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 98% complete...2026-03-25T15:31:06.056+0000 7f01096f0640 0 -- 192.168.123.104:0/835616772 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f00f0002340 msgr2=0x5607b0a3d1b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:31:06.066 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:31:06.069 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg3 2026-03-25T15:31:06.199 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 3% complete... Removing image: 96% complete...2026-03-25T15:31:06.197+0000 7fe028fe4640 0 -- 192.168.123.104:0/339291375 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55ac4afe1030 msgr2=0x55ac4afd1370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:31:06.208 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:31:06.211 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create testimg2 -s 0 2026-03-25T15:31:06.241 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:06.239+0000 7f3ffe948300 -1 librbd: Forced V1 image creation. 2026-03-25T15:31:06.247 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd cp testimg2 testimg3 2026-03-25T15:31:06.288 INFO:tasks.workunit.client.0.vm04.stderr: Image copy: 100% complete...done. 
2026-03-25T15:31:06.292 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep cp testimg2 testimg6 2026-03-25T15:31:06.334 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done. 2026-03-25T15:31:06.337 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg1 2026-03-25T15:31:07.159 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-25T15:31:07.167 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --snap=snap1 testimg-diff1 2026-03-25T15:31:08.163 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-25T15:31:08.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg1 2026-03-25T15:31:08.170 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory' 2026-03-25T15:31:08.201 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory 2026-03-25T15:31:08.201 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --snap=snap1 testimg-diff1 2026-03-25T15:31:08.201 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'error setting snapshot context: (2) No such file or directory' 2026-03-25T15:31:08.232 INFO:tasks.workunit.client.0.vm04.stdout:error setting snapshot context: (2) No such file or directory 2026-03-25T15:31:08.232 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd sparsify testimg1 2026-03-25T15:31:08.318 INFO:tasks.workunit.client.0.vm04.stderr: Image sparsify: 1% complete... Image sparsify: 100% complete...done. 
2026-03-25T15:31:08.326 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:31:08.326 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:08.456 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:08.529 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:08.631 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:09.249 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.257 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.366 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.496 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.619 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.818 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.895 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:10.981 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.052 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.211 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.288 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.369 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.447 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1 2026-03-25T15:31:11.572 INFO:tasks.workunit.client.0.vm04.stderr:+ test_locking 2026-03-25T15:31:11.572 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 
'testing locking...' 2026-03-25T15:31:11.572 INFO:tasks.workunit.client.0.vm04.stdout:testing locking... 2026-03-25T15:31:11.572 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:31:11.572 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.652 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.731 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.810 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.889 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:11.968 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.045 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.119 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.195 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.269 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.346 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.427 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.504 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.581 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.661 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.736 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.811 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.890 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:12.968 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-25T15:31:13.012 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.012 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:31:13.012 
INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:31:13.044 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:31:13.045 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id 2026-03-25T15:31:13.082 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.082 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 ' 2026-03-25T15:31:13.115 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 exclusive lock on this image. 2026-03-25T15:31:13.115 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1 2026-03-25T15:31:13.115 INFO:tasks.workunit.client.0.vm04.stderr:++ tail -n 1 2026-03-25T15:31:13.115 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{print $1;}' 2026-03-25T15:31:13.150 INFO:tasks.workunit.client.0.vm04.stderr:+ LOCKER=client.8326 2026-03-25T15:31:13.150 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock remove test1 id client.8326 2026-03-25T15:31:13.248 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.249 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:31:13.249 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:31:13.284 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:31:13.284 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag 2026-03-25T15:31:13.325 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.325 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 1 ' 2026-03-25T15:31:13.357 INFO:tasks.workunit.client.0.vm04.stdout:There is 1 shared lock on this image. 2026-03-25T15:31:13.358 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id --shared tag 2026-03-25T15:31:13.400 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.400 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 2 ' 2026-03-25T15:31:13.435 INFO:tasks.workunit.client.0.vm04.stdout:There are 2 shared locks on this image. 
2026-03-25T15:31:13.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock add test1 id2 --shared tag 2026-03-25T15:31:13.478 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.478 INFO:tasks.workunit.client.0.vm04.stderr:+ grep ' 3 ' 2026-03-25T15:31:13.510 INFO:tasks.workunit.client.0.vm04.stdout:There are 3 shared locks on this image. 2026-03-25T15:31:13.511 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:13.511 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1 2026-03-25T15:31:13.511 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}' 2026-03-25T15:31:13.511 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1 2026-03-25T15:31:14.372 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info test1 2026-03-25T15:31:14.372 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -qE 'features:.*exclusive' 2026-03-25T15:31:14.408 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1 2026-03-25T15:31:14.448 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n 'There are 2 shared locks on this image. 
2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:Lock tag: tag 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:Locker ID Address 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:client.8341 id 192.168.123.104:0/2646297707 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:client.8347 id 192.168.123.104:0/4292970630' ']' 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}' 2026-03-25T15:31:14.449 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1 2026-03-25T15:31:15.374 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n 'There is 1 shared lock on this image. 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:Lock tag: tag 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:Locker ID Address 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:client.8341 id 192.168.123.104:0/2646297707' ']' 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd lock list test1 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:+ tail -n 1 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:+ awk '{print $2, $1;}' 2026-03-25T15:31:15.409 INFO:tasks.workunit.client.0.vm04.stderr:+ xargs rbd lock remove test1 2026-03-25T15:31:16.380 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd lock list test1 2026-03-25T15:31:16.417 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' -n '' ']' 2026-03-25T15:31:16.417 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 2026-03-25T15:31:16.495 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:16.493+0000 7ff370a02640 0 -- 192.168.123.104:0/632139804 >> 
[v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7ff348008d20 msgr2=0x7ff3480291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:31:16.503 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:31:16.507 INFO:tasks.workunit.client.0.vm04.stderr:+ test_clone 2026-03-25T15:31:16.507 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing clone...' 2026-03-25T15:31:16.507 INFO:tasks.workunit.client.0.vm04.stdout:testing clone... 2026-03-25T15:31:16.507 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:31:16.507 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:16.581 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:16.655 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:16.733 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:16.802 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:16.874 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:16.947 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.019 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.295 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.370 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.453 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.537 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.613 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.694 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.768 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.843 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.916 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:17.987 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:31:18.059 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create test1 --image-format 2 -s 1 2026-03-25T15:31:18.098 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@s1 2026-03-25T15:31:18.385 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-25T15:31:18.393 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@s1 2026-03-25T15:31:18.433 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:31:19.585 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:31:19.597 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:31:22.628 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@s1 rbd2/clone 2026-03-25T15:31:22.682 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls 2026-03-25T15:31:22.682 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone 2026-03-25T15:31:22.709 INFO:tasks.workunit.client.0.vm04.stdout:clone 2026-03-25T15:31:22.709 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd -p rbd2 ls -l 2026-03-25T15:31:22.709 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone 2026-03-25T15:31:22.709 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1@s1 2026-03-25T15:31:22.747 INFO:tasks.workunit.client.0.vm04.stdout:clone 1 MiB rbd/test1@s1 2 2026-03-25T15:31:22.747 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd ls 2026-03-25T15:31:22.774 INFO:tasks.workunit.client.0.vm04.stderr:+ test test1 = test1 2026-03-25T15:31:22.774 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten rbd2/clone 2026-03-25T15:31:22.811 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 100% complete...done. 
2026-03-25T15:31:22.817 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd2/clone@s1 2026-03-25T15:31:23.735 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-25T15:31:23.741 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd2/clone@s1 2026-03-25T15:31:23.775 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone@s1 clone2 2026-03-25T15:31:23.819 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:31:23.819 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone2 2026-03-25T15:31:23.841 INFO:tasks.workunit.client.0.vm04.stdout:clone2 2026-03-25T15:31:23.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:31:23.842 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone2 2026-03-25T15:31:23.842 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/clone@s1 2026-03-25T15:31:23.875 INFO:tasks.workunit.client.0.vm04.stdout:clone2 1 MiB rbd2/clone@s1 2 2026-03-25T15:31:23.875 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd -p rbd2 ls 2026-03-25T15:31:23.904 INFO:tasks.workunit.client.0.vm04.stderr:+ test clone = clone 2026-03-25T15:31:23.904 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone clone3 2026-03-25T15:31:23.904 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'snapshot name was not specified' 2026-03-25T15:31:23.923 INFO:tasks.workunit.client.0.vm04.stdout:rbd: snapshot name was not specified 2026-03-25T15:31:23.923 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone@invalid clone3 2026-03-25T15:31:23.923 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'failed to open parent image' 2026-03-25T15:31:23.956 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25T15:31:23.952+0000 7fc66ffff640 -1 librbd::image::CloneRequest: 0x55bb45e1fa40 handle_open_parent: failed to open parent image: (2) No such file or directory 2026-03-25T15:31:23.957 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone --snap-id 0 
clone3 2026-03-25T15:31:23.957 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'failed to open parent image' 2026-03-25T15:31:23.989 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25T15:31:23.985+0000 7f3eca7fc640 -1 librbd::image::CloneRequest: 0x55dfa35e4700 handle_open_parent: failed to open parent image: (2) No such file or directory 2026-03-25T15:31:23.989 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/clone@invalid --snap-id 0 clone3 2026-03-25T15:31:23.989 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'trying to access snapshot using both name and id' 2026-03-25T15:31:24.009 INFO:tasks.workunit.client.0.vm04.stdout:rbd: trying to access snapshot using both name and id. 2026-03-25T15:31:24.009 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd snap ls rbd2/clone --format json 2026-03-25T15:31:24.010 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.[] | select(.name == "s1") | .id' 2026-03-25T15:31:24.041 INFO:tasks.workunit.client.0.vm04.stderr:+ SNAP_ID=3 2026-03-25T15:31:24.041 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --snap-id 3 rbd2/clone clone3 2026-03-25T15:31:24.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:31:24.087 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone3 2026-03-25T15:31:24.113 INFO:tasks.workunit.client.0.vm04.stdout:clone3 2026-03-25T15:31:24.114 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l 2026-03-25T15:31:24.114 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone3 2026-03-25T15:31:24.114 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/clone@s1 2026-03-25T15:31:24.153 INFO:tasks.workunit.client.0.vm04.stdout:clone3 1 MiB rbd2/clone@s1 2 2026-03-25T15:31:24.153 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd -p rbd2 ls 2026-03-25T15:31:24.182 INFO:tasks.workunit.client.0.vm04.stderr:+ test clone = clone 2026-03-25T15:31:24.182 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd ls -l 2026-03-25T15:31:24.182 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c rbd2/clone@s1 
2026-03-25T15:31:24.220 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 = 2 2026-03-25T15:31:24.220 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten clone3 2026-03-25T15:31:24.258 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 100% complete...done. 2026-03-25T15:31:24.264 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd ls -l 2026-03-25T15:31:24.264 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c rbd2/clone@s1 2026-03-25T15:31:24.301 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1 2026-03-25T15:31:24.301 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone2 2026-03-25T15:31:24.375 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:31:24.379 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect rbd2/clone@s1 2026-03-25T15:31:24.419 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd2/clone@s1 2026-03-25T15:31:24.739 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-25T15:31:24.744 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd2/clone 2026-03-25T15:31:24.803 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:24.801+0000 7fbfe2fee640 0 -- 192.168.123.104:0/3250090084 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x56063e2aa5d0 msgr2=0x56063e2b8c40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:31:24.803 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 
2026-03-25T15:31:24.806 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone3
2026-03-25T15:31:24.870 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:24.869+0000 7fb641238640 0 -- 192.168.123.104:0/889481302 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55b882af5030 msgr2=0x55b882ae5370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:31:24.872 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:24.875 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test1@s1
2026-03-25T15:31:24.910 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@s1
2026-03-25T15:31:25.742 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:25.753 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:31:25.820 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:25.818+0000 7ff7d0cf1640 0 -- 192.168.123.104:0/2404581725 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x56019b935cc0 msgr2=0x56019b911530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:31:25.827 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:25.831 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-25T15:31:26.775 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-25T15:31:26.786 INFO:tasks.workunit.client.0.vm04.stderr:+ test_trash
2026-03-25T15:31:26.786 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing trash...'
2026-03-25T15:31:26.786 INFO:tasks.workunit.client.0.vm04.stdout:testing trash...
2026-03-25T15:31:26.786 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:31:26.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:26.864 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:26.936 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.009 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.081 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.356 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.484 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.559 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.637 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.712 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.787 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.862 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:27.939 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:28.016 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:28.090 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:28.165 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:28.235 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:28.306 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:28.376 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-25T15:31:28.418 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-25T15:31:28.457 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:28.457 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:31:28.485 INFO:tasks.workunit.client.0.vm04.stdout:test1
2026-03-25T15:31:28.485 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:28.485 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-25T15:31:28.513 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-25T15:31:28.513 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:28.513 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:28.513 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:31:28.541 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:31:28.541 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-25T15:31:28.541 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*2.*'
2026-03-25T15:31:28.575 INFO:tasks.workunit.client.0.vm04.stdout:test1 1 MiB 2
2026-03-25T15:31:28.576 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-25T15:31:28.576 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*2.*'
2026-03-25T15:31:28.611 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-25T15:31:28.611 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test1
2026-03-25T15:31:28.663 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:28.663 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-25T15:31:28.691 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-25T15:31:28.691 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:28.691 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:28.691 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:28.726 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:28.726 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-25T15:31:28.726 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*2.*'
2026-03-25T15:31:28.760 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-25T15:31:28.760 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:28.760 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:31:28.788 INFO:tasks.workunit.client.0.vm04.stdout:21ffa34027de test1
2026-03-25T15:31:28.788 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:28.788 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:28.788 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:28.815 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:28.815 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-25T15:31:28.815 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*USER.*'
2026-03-25T15:31:28.849 INFO:tasks.workunit.client.0.vm04.stdout:21ffa34027de test1 USER Wed Mar 25 15:31:28 2026 expired at Wed Mar 25 15:31:28 2026
2026-03-25T15:31:28.849 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-25T15:31:28.849 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v 'protected until'
2026-03-25T15:31:28.885 INFO:tasks.workunit.client.0.vm04.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT
2026-03-25T15:31:28.885 INFO:tasks.workunit.client.0.vm04.stdout:21ffa34027de test1 USER Wed Mar 25 15:31:28 2026 expired at Wed Mar 25 15:31:28 2026
2026-03-25T15:31:28.885 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:28.885 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-25T15:31:28.913 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=21ffa34027de
2026-03-25T15:31:28.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 21ffa34027de
2026-03-25T15:31:28.961 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:28.964 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test2
2026-03-25T15:31:29.017 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:29.017 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-25T15:31:29.046 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=22029179f42c
2026-03-25T15:31:29.046 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info --image-id 22029179f42c
2026-03-25T15:31:29.046 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd image '\''test2'\'''
2026-03-25T15:31:29.080 INFO:tasks.workunit.client.0.vm04.stdout:rbd image 'test2':
2026-03-25T15:31:29.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --image-id 22029179f42c
2026-03-25T15:31:29.080 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:29.080 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:29.112 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:29.112 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash restore 22029179f42c
2026-03-25T15:31:29.149 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:29.149 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-25T15:31:29.174 INFO:tasks.workunit.client.0.vm04.stdout:test2
2026-03-25T15:31:29.174 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:31:29.174 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:29.174 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:29.200 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:29.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls -l
2026-03-25T15:31:29.200 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*2.*'
2026-03-25T15:31:29.231 INFO:tasks.workunit.client.0.vm04.stdout:test2 1 MiB 2
2026-03-25T15:31:29.232 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test2 --expires-at '3600 sec'
2026-03-25T15:31:29.280 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image test2 will expire at 2026-03-25T16:31:29.257643+0000
2026-03-25T15:31:29.284 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:29.284 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test2
2026-03-25T15:31:29.309 INFO:tasks.workunit.client.0.vm04.stdout:22029179f42c test2
2026-03-25T15:31:29.309 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:29.309 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:29.310 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:29.336 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:29.336 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-25T15:31:29.336 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test2.*USER.*protected until'
2026-03-25T15:31:29.367 INFO:tasks.workunit.client.0.vm04.stdout:22029179f42c test2 USER Wed Mar 25 15:31:29 2026 protected until Wed Mar 25 16:31:29 2026
2026-03-25T15:31:29.367 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 22029179f42c
2026-03-25T15:31:29.367 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'Deferment time has not expired'
2026-03-25T15:31:29.394 INFO:tasks.workunit.client.0.vm04.stdout:Deferment time has not expired, please use --force if you really want to remove the image
2026-03-25T15:31:29.394 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm --image-id 22029179f42c --force
2026-03-25T15:31:29.445 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:29.448 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-25T15:31:29.491 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap1
2026-03-25T15:31:29.755 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:29.761 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@snap1
2026-03-25T15:31:29.797 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@snap1 clone
2026-03-25T15:31:29.850 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv test1
2026-03-25T15:31:29.907 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:29.907 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:31:29.934 INFO:tasks.workunit.client.0.vm04.stdout:2259deaa0ed4 test1
2026-03-25T15:31:29.934 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:29.934 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:29.934 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:29.961 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:29.961 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-25T15:31:29.961 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*USER.*'
2026-03-25T15:31:29.994 INFO:tasks.workunit.client.0.vm04.stdout:2259deaa0ed4 test1 USER Wed Mar 25 15:31:29 2026 expired at Wed Mar 25 15:31:29 2026
2026-03-25T15:31:29.995 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-25T15:31:29.995 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v 'protected until'
2026-03-25T15:31:30.028 INFO:tasks.workunit.client.0.vm04.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT
2026-03-25T15:31:30.028 INFO:tasks.workunit.client.0.vm04.stdout:2259deaa0ed4 test1 USER Wed Mar 25 15:31:29 2026 expired at Wed Mar 25 15:31:29 2026
2026-03-25T15:31:30.028 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:30.028 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1
2026-03-25T15:31:30.057 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=2259deaa0ed4
2026-03-25T15:31:30.057 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 2259deaa0ed4
2026-03-25T15:31:30.057 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:31:30.057 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:30.057 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:30.089 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:30.089 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 2259deaa0ed4
2026-03-25T15:31:30.089 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap1.*'
2026-03-25T15:31:30.123 INFO:tasks.workunit.client.0.vm04.stdout: 18 snap1 1 MiB yes Wed Mar 25 15:31:29 2026
2026-03-25T15:31:30.123 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --image-id 2259deaa0ed4
2026-03-25T15:31:30.123 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:30.123 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:30.161 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:30.161 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --image-id 2259deaa0ed4
2026-03-25T15:31:30.161 INFO:tasks.workunit.client.0.vm04.stderr:+ grep clone
2026-03-25T15:31:30.197 INFO:tasks.workunit.client.0.vm04.stdout:rbd/clone
2026-03-25T15:31:30.197 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm clone
2026-03-25T15:31:30.268 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:30.267+0000 7fdefe76e640 0 -- 192.168.123.104:0/957800852 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x565255ad80e0 msgr2=0x565255ab5810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:31:30.273 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:30.277 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect --image-id 2259deaa0ed4 --snap snap1
2026-03-25T15:31:30.316 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm --image-id 2259deaa0ed4 --snap snap1
2026-03-25T15:31:30.756 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:30.764 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 2259deaa0ed4
2026-03-25T15:31:30.764 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:31:30.764 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:30.764 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:30.800 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:30.800 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash restore 2259deaa0ed4
2026-03-25T15:31:30.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap1
2026-03-25T15:31:31.713 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:31.720 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@snap2
2026-03-25T15:31:32.770 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:32.778 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 2259deaa0ed4
2026-03-25T15:31:32.778 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:31:32.778 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:32.778 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:31:32.811 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:31:32.812 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 2259deaa0ed4
2026-03-25T15:31:34.785 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:31:34.795 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls --image-id 2259deaa0ed4
2026-03-25T15:31:34.795 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:31:34.795 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:34.795 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:35.017 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:35.016+0000 7f28c7887640 0 --2- 192.168.123.104:0/278711337 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f28b40022f0 0x7f28b40026e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-25T15:31:35.028 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:35.028 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_to_trash_on_remove=true --rbd_move_to_trash_on_remove_expire_seconds=3600 test1
2026-03-25T15:31:35.076 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:35.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:35.080 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:31:35.106 INFO:tasks.workunit.client.0.vm04.stdout:2259deaa0ed4 test1
2026-03-25T15:31:35.106 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:35.106 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:35.106 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:35.132 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:35.132 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -l
2026-03-25T15:31:35.132 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'test1.*USER.*protected until'
2026-03-25T15:31:35.163 INFO:tasks.workunit.client.0.vm04.stdout:2259deaa0ed4 test1 USER Wed Mar 25 15:31:35 2026 protected until Wed Mar 25 16:31:35 2026
2026-03-25T15:31:35.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm 2259deaa0ed4
2026-03-25T15:31:35.163 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'Deferment time has not expired'
2026-03-25T15:31:35.188 INFO:tasks.workunit.client.0.vm04.stdout:Deferment time has not expired, please use --force if you really want to remove the image
2026-03-25T15:31:35.188 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm --image-id 2259deaa0ed4 --force
2026-03-25T15:31:35.234 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:31:35.237 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:31:35.237 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.312 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.388 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.471 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.632 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.710 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.864 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:35.945 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.223 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.298 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.378 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.457 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.535 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.610 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.690 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.778 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.859 INFO:tasks.workunit.client.0.vm04.stderr:+ test_purge
2026-03-25T15:31:36.859 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing trash purge...'
2026-03-25T15:31:36.859 INFO:tasks.workunit.client.0.vm04.stdout:testing trash purge...
2026-03-25T15:31:36.859 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:31:36.859 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:36.935 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.008 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.084 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.159 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.233 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.307 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.379 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.454 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.528 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.602 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.679 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.753 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.902 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:37.976 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:38.046 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:38.117 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:31:38.188 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:38.188 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:38.188 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:38.213 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:38.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:38.236 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete...done.
2026-03-25T15:31:38.239 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:38.276 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-25T15:31:38.313 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-25T15:31:38.361 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-25T15:31:38.413 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:38.413 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:38.413 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:31:38.442 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:31:38.442 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:38.510 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 50% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:38.514 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:38.514 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:38.514 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:38.542 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:38.542 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:38.808 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-25T15:31:38.858 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1 --expires-at '1 hour'
2026-03-25T15:31:38.906 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image testimg1 will expire at 2026-03-25T16:31:38.885549+0000
2026-03-25T15:31:38.910 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2 --expires-at '3 hours'
2026-03-25T15:31:38.956 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image testimg2 will expire at 2026-03-25T18:31:38.938601+0000
2026-03-25T15:31:38.960 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:38.960 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:38.960 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:31:38.986 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:31:38.987 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:39.010 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete...done.
2026-03-25T15:31:39.012 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:39.012 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:39.013 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:31:39.038 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:31:39.038 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge --expired-before 'now + 2 hours'
2026-03-25T15:31:39.089 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:39.093 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:39.093 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:39.093 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:39.120 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:39.120 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:39.120 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:31:39.145 INFO:tasks.workunit.client.0.vm04.stdout:239d8b63be3a testimg2
2026-03-25T15:31:39.146 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge --expired-before 'now + 4 hours'
2026-03-25T15:31:39.194 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:39.198 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:39.198 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:39.198 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:39.225 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:39.225 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:39.261 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap
2026-03-25T15:31:39.939 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:39.946 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-25T15:31:39.985 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:31:40.022 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-25T15:31:40.272 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-25T15:31:40.328 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:31:40.379 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:40.379 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:40.379 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:31:40.405 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:31:40.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:40.405 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:40.513 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:40.514 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:40.514 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:40.514 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:40.540 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:40.540 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:40.540 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:31:40.768 INFO:tasks.workunit.client.0.vm04.stdout:23bae15f4670 testimg1
2026-03-25T15:31:40.768 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:40.768 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }'
2026-03-25T15:31:40.796 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=23bae15f4670
2026-03-25T15:31:40.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 23bae15f4670
2026-03-25T15:31:40.940 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:31:40.947 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:40.994 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:40.997 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:40.998 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:40.998 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:41.024 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:41.024 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:41.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-25T15:31:41.104 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap
2026-03-25T15:31:41.718 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:41.725 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:31:41.767 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-25T15:31:41.821 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-25T15:31:41.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:31:41.917 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:41.917 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:41.917 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:31:41.942 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:31:41.942 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:41.942 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:42.034 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:42.034 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:42.034 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:42.034 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:42.060 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:42.060 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:42.060 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:31:42.086 INFO:tasks.workunit.client.0.vm04.stdout:23e9fadb4e1e testimg2
2026-03-25T15:31:42.087 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:42.087 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }'
2026-03-25T15:31:42.113 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=23e9fadb4e1e
2026-03-25T15:31:42.113 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 23e9fadb4e1e
2026-03-25T15:31:42.946 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:31:42.953 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:43.001 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:43.004 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:43.004 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:43.005 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:43.032 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:43.032 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:43.069 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg2
2026-03-25T15:31:43.107 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:31:43.146 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg3@snap
2026-03-25T15:31:43.953 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:43.961 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-25T15:31:44.010 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-25T15:31:44.058 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:31:44.110 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:44.110 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:44.110 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:31:44.134 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:31:44.134 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:44.134 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:44.228 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:44.228 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:44.228 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:44.228 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:44.254 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:44.254 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:44.254 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-25T15:31:44.278 INFO:tasks.workunit.client.0.vm04.stdout:241455076269 testimg3
2026-03-25T15:31:44.278 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:44.278 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }'
2026-03-25T15:31:44.305 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=241455076269
2026-03-25T15:31:44.305 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 241455076269
2026-03-25T15:31:45.017 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:31:45.025 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:45.070 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:45.073 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:45.074 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:45.074 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:45.099 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:45.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:45.134 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap
2026-03-25T15:31:46.031 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:46.038 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-25T15:31:46.090 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap
2026-03-25T15:31:46.126 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:46.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:31:46.172 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap
2026-03-25T15:31:46.721 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:46.730 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-25T15:31:46.788 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-25T15:31:46.846 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap
2026-03-25T15:31:46.888 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:46.895 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap
2026-03-25T15:31:47.719 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:47.726 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-25T15:31:47.789 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap
2026-03-25T15:31:47.835 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:47.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-25T15:31:47.906 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-25T15:31:47.969 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:31:48.022 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg4
2026-03-25T15:31:48.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.080 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:48.080 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:31:48.109 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:31:48.109 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:48.109 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:48.281 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:48.282 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.282 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:48.282 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:31:48.309 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:31:48.309 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.309 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:31:48.337 INFO:tasks.workunit.client.0.vm04.stdout:243ac449f4e4 testimg1
2026-03-25T15:31:48.337 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.337 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:31:48.364 INFO:tasks.workunit.client.0.vm04.stdout:243f8132cae testimg2
2026-03-25T15:31:48.364 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.364 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg4
2026-03-25T15:31:48.390 INFO:tasks.workunit.client.0.vm04.stdout:244a98356c0d testimg4
2026-03-25T15:31:48.391 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6
2026-03-25T15:31:48.445 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.445 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:48.445 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:31:48.472 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:31:48.473 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:48.473 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:48.819 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:48.819 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.819 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:48.819 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:31:48.844 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:31:48.845 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.845 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:31:48.871 INFO:tasks.workunit.client.0.vm04.stdout:243ac449f4e4 testimg1
2026-03-25T15:31:48.871 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.871 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:31:48.898 INFO:tasks.workunit.client.0.vm04.stdout:243f8132cae testimg2
2026-03-25T15:31:48.899 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5
2026-03-25T15:31:48.946 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:48.946 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:48.946 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:31:48.972 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:31:48.972 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:49.049 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:49.048+0000 7ff077154640 0 -- 192.168.123.104:0/2016083421 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55d016765410 msgr2=0x55d016743560 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:31:49.057 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:31:49.055+0000 7ff077154640 0 -- 192.168.123.104:0/2016083421 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7ff05405f140 msgr2=0x7ff05407f540 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:31:50.884 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:50.888 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:50.888 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:50.888 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:50.913 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:50.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:50.950 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap
2026-03-25T15:31:51.720 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:51.728 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-25T15:31:51.791 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap
2026-03-25T15:31:51.833 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:51.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:31:51.885 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg3@snap
2026-03-25T15:31:52.853 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:52.862 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap
2026-03-25T15:31:53.854 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:53.862 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-25T15:31:53.926 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-25T15:31:53.984 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap
2026-03-25T15:31:54.026 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:54.034 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap
2026-03-25T15:31:54.859 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:54.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-25T15:31:54.929 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap
2026-03-25T15:31:54.974 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:54.982 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg1
2026-03-25T15:31:55.030 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg2
2026-03-25T15:31:55.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:31:55.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg4
2026-03-25T15:31:55.185 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:55.186 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:55.186 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:31:55.214 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:31:55.214 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:55.214 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:55.305 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:55.305 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:55.306 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:55.306 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:31:55.332 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:31:55.332 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6
2026-03-25T15:31:55.379 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:55.379 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:55.379 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 5
2026-03-25T15:31:55.404 INFO:tasks.workunit.client.0.vm04.stdout:5
2026-03-25T15:31:55.405 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:55.405 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:56.001 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:56.002 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:56.002 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:56.002 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:31:56.029 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:31:56.030 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:56.030 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:31:56.059 INFO:tasks.workunit.client.0.vm04.stdout:2495126c2bde testimg1
2026-03-25T15:31:56.059 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:56.059 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:31:56.084 INFO:tasks.workunit.client.0.vm04.stdout:249bbc47210d testimg2
2026-03-25T15:31:56.084 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:56.084 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-25T15:31:56.111 INFO:tasks.workunit.client.0.vm04.stdout:24a17d42e23b testimg3
2026-03-25T15:31:56.112 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5
2026-03-25T15:31:56.166 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:56.166 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:56.166 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:31:56.195 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:31:56.196 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:56.196 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:31:57.830 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:31:57.831 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:57.831 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:57.831 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:31:57.857 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:31:57.857 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:57.857 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-25T15:31:57.885 INFO:tasks.workunit.client.0.vm04.stdout:24a17d42e23b testimg3
2026-03-25T15:31:57.885 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:31:57.885 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }'
2026-03-25T15:31:57.914 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=24a17d42e23b
2026-03-25T15:31:57.914 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 24a17d42e23b
2026-03-25T15:31:58.722 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:31:58.730 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:31:58.783 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:31:58.787 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:31:58.787 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:31:58.787 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:31:58.815 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:31:58.815 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:31:58.855 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap
2026-03-25T15:31:59.729 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:31:59.736 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-25T15:31:59.788 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap
2026-03-25T15:31:59.823 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:31:59.829 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:31:59.870 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap
2026-03-25T15:32:00.739 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:00.748 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-25T15:32:00.807 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-25T15:32:00.864 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap
2026-03-25T15:32:00.902 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:00.908 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap
2026-03-25T15:32:01.723 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:01.733 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-25T15:32:01.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap
2026-03-25T15:32:01.841 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:01.849 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-25T15:32:01.909 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:01.913 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-25T15:32:01.969 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:01.972 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:32:02.023 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-25T15:32:02.077 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:02.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.081 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:02.081 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:32:02.106 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:32:02.106 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:02.106 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:32:02.237 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:32:02.237 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.237 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:02.237 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:32:02.262 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:32:02.263 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.263 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:32:02.288 INFO:tasks.workunit.client.0.vm04.stdout:2500be77117c testimg1
2026-03-25T15:32:02.288 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.288 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:32:02.315 INFO:tasks.workunit.client.0.vm04.stdout:25065b86ecfe testimg2
2026-03-25T15:32:02.315 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.315 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg4
2026-03-25T15:32:02.342 INFO:tasks.workunit.client.0.vm04.stdout:25126a0dad90 testimg4
2026-03-25T15:32:02.342 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6
2026-03-25T15:32:02.391 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.391 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:02.391 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:32:02.417 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:32:02.417 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:02.417 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:32:02.810 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:32:02.810 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.810 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:02.810 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2
2026-03-25T15:32:02.837 INFO:tasks.workunit.client.0.vm04.stdout:2
2026-03-25T15:32:02.837 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.837 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:32:02.865 INFO:tasks.workunit.client.0.vm04.stdout:2500be77117c testimg1
2026-03-25T15:32:02.866 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.866 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:32:02.893 INFO:tasks.workunit.client.0.vm04.stdout:25065b86ecfe testimg2
2026-03-25T15:32:02.893 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5
2026-03-25T15:32:02.949 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:02.949 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:02.949 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:32:02.976 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:32:02.976 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:03.052 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:03.050+0000 7f81f086c640 0 -- 192.168.123.104:0/296338462 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55a18454a410 msgr2=0x55a184528560 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:03.059 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:03.057+0000 7f81f086c640 0 -- 192.168.123.104:0/296338462 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f81d005f140 msgr2=0x7f81d007f540 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:32:04.785 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:32:04.789 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:04.789 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:04.789 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:32:04.816 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:32:04.816 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-25T15:32:04.857 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1@snap
2026-03-25T15:32:05.754 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:05.762 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-25T15:32:05.815 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg1@snap
2026-03-25T15:32:05.850 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:05.856 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-25T15:32:05.894 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg3@snap
2026-03-25T15:32:06.725 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:06.732 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap
2026-03-25T15:32:07.766 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:07.776 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-25T15:32:07.836 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-25T15:32:07.897 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg2@snap
2026-03-25T15:32:07.938 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:07.946 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg4@snap
2026-03-25T15:32:08.764 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:08.771 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-25T15:32:08.840 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm testimg4@snap
2026-03-25T15:32:08.887 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:08.896 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-25T15:32:08.961 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:08.965 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-25T15:32:09.039 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:09.043 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg3
2026-03-25T15:32:09.095 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-25T15:32:09.156 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:09.160 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.160 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:09.160 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:32:09.186 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:32:09.186 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:09.186 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:32:09.289 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:32:09.289 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.289 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:09.289 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:32:09.317 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:32:09.317 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg6
2026-03-25T15:32:09.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.371 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:09.371 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 5
2026-03-25T15:32:09.398 INFO:tasks.workunit.client.0.vm04.stdout:5
2026-03-25T15:32:09.398 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:09.398 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:32:09.868 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:32:09.869 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.869 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:09.869 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 3
2026-03-25T15:32:09.902 INFO:tasks.workunit.client.0.vm04.stdout:3
2026-03-25T15:32:09.903 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.903 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg1
2026-03-25T15:32:09.931 INFO:tasks.workunit.client.0.vm04.stdout:255c38a4a49f testimg1
2026-03-25T15:32:09.931 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.931 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg2
2026-03-25T15:32:09.961 INFO:tasks.workunit.client.0.vm04.stdout:256279c5f6d2 testimg2
2026-03-25T15:32:09.961 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:09.961 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-25T15:32:09.991 INFO:tasks.workunit.client.0.vm04.stdout:25683a87cd73 testimg3
2026-03-25T15:32:09.991 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv testimg5
2026-03-25T15:32:10.049 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:10.049 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:10.049 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 4
2026-03-25T15:32:10.080 INFO:tasks.workunit.client.0.vm04.stdout:4
2026-03-25T15:32:10.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:10.081 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'some expired images could not be removed'
2026-03-25T15:32:11.792 INFO:tasks.workunit.client.0.vm04.stdout:rbd: some expired images could not be removed
2026-03-25T15:32:11.793 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:11.793 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:11.793 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:32:11.820 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:32:11.821 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:11.821 INFO:tasks.workunit.client.0.vm04.stderr:+ grep testimg3
2026-03-25T15:32:11.848 INFO:tasks.workunit.client.0.vm04.stdout:25683a87cd73 testimg3
2026-03-25T15:32:11.848 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash ls
2026-03-25T15:32:11.848 INFO:tasks.workunit.client.0.vm04.stderr:++ awk '{ print $1 }'
2026-03-25T15:32:11.879 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=25683a87cd73
2026-03-25T15:32:11.879 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge --image-id 25683a87cd73
2026-03-25T15:32:12.794 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:32:12.818 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge
2026-03-25T15:32:12.913 INFO:tasks.workunit.client.0.vm04.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-25T15:32:12.917 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls
2026-03-25T15:32:12.917 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:12.917 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 0
2026-03-25T15:32:12.945 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:32:12.946 INFO:tasks.workunit.client.0.vm04.stderr:+ test_deep_copy_clone
2026-03-25T15:32:12.946 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing deep copy clone...'
2026-03-25T15:32:12.946 INFO:tasks.workunit.client.0.vm04.stdout:testing deep copy clone...
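The trash test above verifies each state change by piping `rbd trash ls` through `wc -l | grep N` and by grepping for a specific image line. The counting idiom can be reproduced in isolation, with a hypothetical `trash_ls` function standing in for `rbd trash ls` (its output lines are copied from this log, so no cluster is needed):

```shell
#!/bin/sh
# Stand-in for `rbd trash ls`: one "<id> <name>" line per trashed image,
# using the ids/names that appear in this log (hypothetical fixture data).
trash_ls() {
    printf '255c38a4a49f testimg1\n'
    printf '256279c5f6d2 testimg2\n'
    printf '25683a87cd73 testimg3\n'
}

# The workunit's count check: exit status is 0 only if the count matches.
trash_ls | wc -l | grep 3

# The workunit's membership check: succeeds if the named image is listed.
trash_ls | grep testimg3
```

Note the check is substring-based (`grep 3` would also match 13 or 30), which is fine for the small counts used here.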
2026-03-25T15:32:12.946 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:32:12.946 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.021 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.107 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.183 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.255 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.327 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.401 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.477 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.620 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.692 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.772 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.861 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:13.940 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:14.018 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:14.091 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:14.166 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:14.244 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:14.319 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create testimg1 --image-format 2 --size 256
2026-03-25T15:32:14.359 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-25T15:32:14.784 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:14.796 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect testimg1@snap1
2026-03-25T15:32:14.837 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-25T15:32:14.893 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap2
2026-03-25T15:32:15.879 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:15.889 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy testimg2 testimg3
2026-03-25T15:32:15.968 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:15.967+0000 7fa710648640 0 -- 192.168.123.104:0/529945891 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e8e2398ac0 msgr2=0x55e8e24d7110 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:16.736 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete...2026-03-25T15:32:16.735+0000 7fa710648640 0 -- 192.168.123.104:0/529945891 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fa6f005f170 msgr2=0x7fa6f007f570 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:16.739 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 100% complete...done.
2026-03-25T15:32:16.744 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:32:16.744 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:32:16.785 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:32:16.785 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:32:16.785 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: rbd/testimg1@snap1'
2026-03-25T15:32:16.821 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/testimg1@snap1
2026-03-25T15:32:16.821 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-25T15:32:16.821 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:32:16.821 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:16.821 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:32:16.859 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:32:16.859 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-25T15:32:16.859 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap2.*'
2026-03-25T15:32:16.900 INFO:tasks.workunit.client.0.vm04.stdout: 40 snap2 256 MiB Wed Mar 25 15:32:16 2026
2026-03-25T15:32:16.900 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-25T15:32:16.900 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features:.*deep-flatten'
2026-03-25T15:32:16.940 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-25T15:32:16.940 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:32:16.941 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features:.*deep-flatten'
2026-03-25T15:32:16.978 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-25T15:32:16.978 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten testimg2
2026-03-25T15:32:17.021 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-25T15:32:17.030 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten testimg3
2026-03-25T15:32:17.069 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-25T15:32:17.076 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect testimg1@snap1
2026-03-25T15:32:17.114 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge testimg2
2026-03-25T15:32:17.885 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:32:17.895 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge testimg3
2026-03-25T15:32:18.885 INFO:tasks.workunit.client.0.vm04.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-25T15:32:18.892 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg2
2026-03-25T15:32:18.966 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-25T15:32:18.966+0000 7f3a024a8640 0 -- 192.168.123.104:0/2391757454 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f39dc008d20 msgr2=0x7f39dc0291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:18.976 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:18.980 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm testimg3
2026-03-25T15:32:19.050 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-25T15:32:19.049+0000 7fea87781640 0 -- 192.168.123.104:0/3019550252 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55974d691cc0 msgr2=0x55974d6e3840 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:19.052 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
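The deep-copy verification above works by grepping fields out of `rbd info` output (size, parent, features). The checks can be exercised offline against the exact `rbd info` lines captured in this log, with a here-doc standing in for the live command:

```shell
#!/bin/sh
# `rbd info` output copied from this log run, used as a fixture so the
# grep checks below run without a cluster.
info=$(cat <<'EOF'
rbd image 'testimg3':
	size 256 MiB in 64 objects
	parent: rbd/testimg1@snap1
	features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
EOF
)

# Same assertions the workunit makes after `rbd deep copy testimg2 testimg3`:
echo "$info" | grep 'size 256 MiB'                 # size preserved
echo "$info" | grep 'parent: rbd/testimg1@snap1'   # parent link preserved
echo "$info" | grep 'features:.*deep-flatten'      # feature bits preserved
```

Each `grep` exits non-zero if the field is missing, which is what makes these one-liners usable as assertions under `set -e`.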
2026-03-25T15:32:19.057 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect testimg1@snap1
2026-03-25T15:32:19.092 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-25T15:32:19.146 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create testimg2@snap2
2026-03-25T15:32:19.896 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:19.908 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd deep copy --flatten testimg2 testimg3
2026-03-25T15:32:21.099 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:21.099+0000 7f4c2698f640 0 -- 192.168.123.104:0/2436032542 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e0ba7acac0 msgr2=0x55e0ba8392b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:21.161 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete...2026-03-25T15:32:21.160+0000 7f4c2698f640 0 -- 192.168.123.104:0/2436032542 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f4c0405f860 msgr2=0x7f4c0407fc40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:22.023 INFO:tasks.workunit.client.0.vm04.stderr: Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-25T15:32:22.030 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:32:22.030 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'size 256 MiB'
2026-03-25T15:32:22.065 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:32:22.065 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg3
2026-03-25T15:32:22.065 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v parent:
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout:rbd image 'testimg3':
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: size 256 MiB in 64 objects
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: order 22 (4 MiB objects)
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: snapshot_count: 1
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: id: 2673a4522169
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: block_name_prefix: rbd_data.2673a4522169
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: format: 2
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: op_features:
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: flags:
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: create_timestamp: Wed Mar 25 15:32:19 2026
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: access_timestamp: Wed Mar 25 15:32:19 2026
2026-03-25T15:32:22.103 INFO:tasks.workunit.client.0.vm04.stdout: modify_timestamp: Wed Mar 25 15:32:19 2026
2026-03-25T15:32:22.104 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-25T15:32:22.104 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -v SNAPID
2026-03-25T15:32:22.104 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:22.104 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 1
2026-03-25T15:32:22.143 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:32:22.143 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap ls testimg3
2026-03-25T15:32:22.143 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '.*snap2.*'
2026-03-25T15:32:22.181 INFO:tasks.workunit.client.0.vm04.stdout: 42 snap2 256 MiB Wed Mar 25 15:32:21 2026
2026-03-25T15:32:22.181 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info testimg2
2026-03-25T15:32:22.182 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'features:.*deep-flatten'
2026-03-25T15:32:22.222 INFO:tasks.workunit.client.0.vm04.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-25T15:32:22.228 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten testimg2
2026-03-25T15:32:22.267 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-25T15:32:22.275 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect testimg1@snap1
2026-03-25T15:32:22.318 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:32:22.318 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:23.128 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:24.193 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.244 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.318 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.396 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.474 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.549 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.625 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.707 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.783 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.866 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:25.955 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.037 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.110 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.190 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.268 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.350 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.427 INFO:tasks.workunit.client.0.vm04.stderr:+ test_clone_v2
2026-03-25T15:32:26.427 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing clone v2...'
2026-03-25T15:32:26.427 INFO:tasks.workunit.client.0.vm04.stdout:testing clone v2...
2026-03-25T15:32:26.427 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:32:26.427 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.502 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.578 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.652 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.733 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.816 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.898 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:26.990 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.072 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.349 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.424 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.500 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.577 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.656 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.737 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.816 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.907 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:27.994 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:28.080 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-25T15:32:28.124 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@1
2026-03-25T15:32:28.748 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:28.755 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test2
2026-03-25T15:32:28.787 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:28.786+0000 7fed0cbe1640 -1 librbd::image::CloneRequest: 0x562106e45dd0 validate_parent: parent snapshot must be protected
2026-03-25T15:32:28.787 INFO:tasks.workunit.client.0.vm04.stderr:rbd: clone error: (22) Invalid argument
2026-03-25T15:32:28.791 INFO:tasks.workunit.client.0.vm04.stderr:+ true
2026-03-25T15:32:28.791 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 test1@1 test2
2026-03-25T15:32:28.847 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd snap ls test1 --format json
2026-03-25T15:32:28.847 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.[] | select(.name == "1") | .id'
2026-03-25T15:32:28.883 INFO:tasks.workunit.client.0.vm04.stderr:+ SNAP_ID=43
2026-03-25T15:32:28.883 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=2 --snap-id 43 test1 test3
2026-03-25T15:32:28.940 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect test1@1
2026-03-25T15:32:28.979 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test4
2026-03-25T15:32:29.043 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children test1@1
2026-03-25T15:32:29.044 INFO:tasks.workunit.client.0.vm04.stderr:+ sort
2026-03-25T15:32:29.044 INFO:tasks.workunit.client.0.vm04.stderr:+ tr '\n' ' '
2026-03-25T15:32:29.044 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -E 'test2.*test3.*test4'
2026-03-25T15:32:29.081 INFO:tasks.workunit.client.0.vm04.stdout:rbd/test2 rbd/test3 rbd/test4
2026-03-25T15:32:29.082 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd children --descendants test1
2026-03-25T15:32:29.082 INFO:tasks.workunit.client.0.vm04.stderr:+ sort
2026-03-25T15:32:29.082 INFO:tasks.workunit.client.0.vm04.stderr:+ tr '\n' ' '
2026-03-25T15:32:29.082 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -E 'test2.*test3.*test4'
2026-03-25T15:32:29.131 INFO:tasks.workunit.client.0.vm04.stdout:rbd/test2 rbd/test3 rbd/test4
2026-03-25T15:32:29.131 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove test4
2026-03-25T15:32:29.230 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:29.229+0000 7fd9743eb640 0 -- 192.168.123.104:0/3009578374 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fd95405f110 msgr2=0x7fd95407f510 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:29.238 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:29.242 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect test1@1
2026-03-25T15:32:29.284 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap remove test1@1
2026-03-25T15:32:29.325 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:29.334 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap list --all test1
2026-03-25T15:32:29.335 INFO:tasks.workunit.client.0.vm04.stderr:+ grep -E 'trash \(user 1\) *$'
2026-03-25T15:32:29.372 INFO:tasks.workunit.client.0.vm04.stdout: 43 af1f0613-13cd-48fd-9055-4f094e635975 1 MiB Wed Mar 25 15:32:28 2026 trash (user 1)
2026-03-25T15:32:29.373 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@2
2026-03-25T15:32:29.750 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:29.757 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:29.758 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'image has snapshots'
2026-03-25T15:32:29.806 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image has snapshots - these must be deleted with 'rbd snap purge' before the image can be removed.
2026-03-25T15:32:29.806 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@2
2026-03-25T15:32:30.751 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:30.763 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:30.763 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'linked clones'
2026-03-25T15:32:30.818 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-25T15:32:30.818 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test3
2026-03-25T15:32:30.904 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:30.903+0000 7f5b26c6e640 0 -- 192.168.123.104:0/538014907 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55f9cfaf9c30 msgr2=0x55f9cfada840 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:30.916 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:30.920 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:30.920 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'linked clones'
2026-03-25T15:32:30.967 INFO:tasks.workunit.client.0.vm04.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-25T15:32:30.967 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd flatten test2
2026-03-25T15:32:31.749 INFO:tasks.workunit.client.0.vm04.stderr: Image flatten: 100% complete...done.
2026-03-25T15:32:31.758 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap list --all test1
2026-03-25T15:32:31.758 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:31.758 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:32:31.791 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:32:31.791 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:32.078 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:32.077+0000 7f0d94b45640 0 -- 192.168.123.104:0/2742344780 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558c46071c30 msgr2=0x558c460c8de0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:32.081 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:32.085 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:32:32.157 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:32.157+0000 7fa5d7838640 0 -- 192.168.123.104:0/288398836 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x55ec70f30410 msgr2=0x55ec70f50890 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:32.163 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:32.166 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-25T15:32:32.208 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@1
2026-03-25T15:32:32.760 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:32.767 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create test1@2
2026-03-25T15:32:33.764 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:32:33.772 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@1 test2 --rbd-default-clone-format 2
2026-03-25T15:32:33.827 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone test1@2 test3 --rbd-default-clone-format 2
2026-03-25T15:32:33.884 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@1
2026-03-25T15:32:33.922 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:33.930 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm test1@2
2026-03-25T15:32:33.971 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:32:33.979 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd rm test1
2026-03-25T15:32:33.979 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:34.019 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:34.018+0000 7fba1989e300 -1 librbd::api::Image: remove: image has snapshots - not removing
2026-03-25T15:32:34.019 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-25T15:32:34.023 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-25T15:32:34.026 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:32:34.027 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1 --rbd-move-parent-to-trash-on-remove=true
2026-03-25T15:32:34.087 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:34.090 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -a
2026-03-25T15:32:34.090 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:32:34.116 INFO:tasks.workunit.client.0.vm04.stdout:279e29ae9ab2 test1
2026-03-25T15:32:34.116 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test2
2026-03-25T15:32:34.182 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:34.182+0000 7fc8d6f8f640 0 -- 192.168.123.104:0/1220386281 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x555e22406cc0 msgr2=0x555e223e2910 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:34.191 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:34.191+0000 7fc8d6f8f640 0 -- 192.168.123.104:0/1220386281 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fc8b405f100 msgr2=0x7fc8b407f500 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:34.780 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:34.784 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -a
2026-03-25T15:32:34.784 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:32:34.811 INFO:tasks.workunit.client.0.vm04.stdout:279e29ae9ab2 test1
2026-03-25T15:32:34.812 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test3
2026-03-25T15:32:34.888 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:34.887+0000 7f22da26f640 0 -- 192.168.123.104:0/926107231 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x55f26d82f1f0 msgr2=0x55f26d84f670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:34.897 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:34.896+0000 7f22d9a6e640 0 -- 192.168.123.104:0/926107231 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f22b000d240 msgr2=0x7f22b000d6b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:32:35.791 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:35.795 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls -a
2026-03-25T15:32:35.795 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep test1
2026-03-25T15:32:35.795 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:32:35.822 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:32:35.823 INFO:tasks.workunit.client.0.vm04.stderr:+ test_thick_provision
2026-03-25T15:32:35.823 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing thick provision...'
2026-03-25T15:32:35.823 INFO:tasks.workunit.client.0.vm04.stdout:testing thick provision...
2026-03-25T15:32:35.823 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:32:35.823 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:35.972 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.056 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.131 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.206 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.279 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.355 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.429 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.503 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.578 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.657 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.740 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.820 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.898 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:36.980 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:37.063 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:37.142 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:37.242 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:32:37.329 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --thick-provision -s 64M test1
2026-03-25T15:32:37.625 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 6% complete... Thick provisioning: 12% complete... Thick provisioning: 18% complete... Thick provisioning: 25% complete... Thick provisioning: 31% complete... Thick provisioning: 37% complete... Thick provisioning: 43% complete... Thick provisioning: 50% complete... Thick provisioning: 56% complete... Thick provisioning: 62% complete... Thick provisioning: 68% complete... Thick provisioning: 75% complete... Thick provisioning: 81% complete... Thick provisioning: 87% complete... Thick provisioning: 93% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done.
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']'
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' '
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5
2026-03-25T15:32:37.639 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^64 MiB'
2026-03-25T15:32:37.679 INFO:tasks.workunit.client.0.vm04.stdout:64 MiB
2026-03-25T15:32:37.679 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0
2026-03-25T15:32:37.679 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']'
2026-03-25T15:32:37.679 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:32:37.679 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:32:37.713 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED
2026-03-25T15:32:37.713 INFO:tasks.workunit.client.0.vm04.stdout:test1 64 MiB 64 MiB
2026-03-25T15:32:37.717 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']'
2026-03-25T15:32:37.717 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:37.837 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 6% complete... Removing image: 12% complete... Removing image: 18% complete... Removing image: 25% complete... Removing image: 31% complete... Removing image: 37% complete... Removing image: 43% complete... Removing image: 50% complete... Removing image: 56% complete... Removing image: 62% complete... Removing image: 68% complete... Removing image: 75% complete... Removing image: 81% complete... Removing image: 87% complete... Removing image: 93% complete...2026-03-25T15:32:37.836+0000 7f8e21f2b640 0 -- 192.168.123.104:0/3148369064 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f8e0005f180 msgr2=0x7f8e0007f580 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:37.863 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:37.863+0000 7f8e21f2b640 0 -- 192.168.123.104:0/3148369064 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e61fabccc0 msgr2=0x7f8e000a0800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:37.870 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:32:37.875 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls
2026-03-25T15:32:37.875 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:32:37.875 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:32:37.875 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:32:37.903 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:32:37.903 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --thick-provision -s 4G test1
2026-03-25T15:32:38.544 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 1% complete... Thick provisioning: 2% complete... Thick provisioning: 3% complete... Thick provisioning: 4% complete... Thick provisioning: 5% complete...2026-03-25T15:32:38.544+0000 7f2fe118f640 0 -- 192.168.123.104:0/902463921 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564ba2d8eb50 msgr2=0x564ba2e8d8c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:38.686 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 6% complete...2026-03-25T15:32:38.685+0000 7f2fe118f640 0 -- 192.168.123.104:0/902463921 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f2fc005ef80 msgr2=0x7f2fc007f380 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:54.237 INFO:tasks.workunit.client.0.vm04.stderr: Thick provisioning: 7% complete... Thick provisioning: 8% complete... Thick provisioning: 9% complete... Thick provisioning: 10% complete... Thick provisioning: 11% complete... Thick provisioning: 12% complete... Thick provisioning: 13% complete... Thick provisioning: 14% complete... Thick provisioning: 15% complete... Thick provisioning: 16% complete... Thick provisioning: 17% complete... Thick provisioning: 18% complete... Thick provisioning: 19% complete... Thick provisioning: 20% complete... Thick provisioning: 21% complete... Thick provisioning: 22% complete... Thick provisioning: 23% complete... Thick provisioning: 24% complete... Thick provisioning: 25% complete... Thick provisioning: 26% complete... Thick provisioning: 27% complete... Thick provisioning: 28% complete... Thick provisioning: 29% complete... Thick provisioning: 30% complete... Thick provisioning: 31% complete... Thick provisioning: 32% complete... Thick provisioning: 33% complete... Thick provisioning: 34% complete... Thick provisioning: 35% complete... Thick provisioning: 36% complete... Thick provisioning: 37% complete... Thick provisioning: 38% complete... Thick provisioning: 39% complete... Thick provisioning: 40% complete... Thick provisioning: 41% complete... Thick provisioning: 42% complete... Thick provisioning: 43% complete... Thick provisioning: 44% complete... Thick provisioning: 45% complete... Thick provisioning: 46% complete... Thick provisioning: 47% complete... Thick provisioning: 48% complete... Thick provisioning: 49% complete... Thick provisioning: 50% complete... Thick provisioning: 51% complete... Thick provisioning: 52% complete... Thick provisioning: 53% complete... Thick provisioning: 54% complete... Thick provisioning: 55% complete... Thick provisioning: 56% complete... Thick provisioning: 57% complete... Thick provisioning: 58% complete... Thick provisioning: 59% complete... Thick provisioning: 60% complete... Thick provisioning: 61% complete... Thick provisioning: 62% complete... Thick provisioning: 63% complete... Thick provisioning: 64% complete... Thick provisioning: 65% complete... Thick provisioning: 66% complete... Thick provisioning: 67% complete... Thick provisioning: 68% complete... Thick provisioning: 69% complete... Thick provisioning: 70% complete... Thick provisioning: 71% complete... Thick provisioning: 72% complete... Thick provisioning: 73% complete... Thick provisioning: 74% complete... Thick provisioning: 75% complete... Thick provisioning: 76% complete... Thick provisioning: 77% complete... Thick provisioning: 78% complete... Thick provisioning: 79% complete... Thick provisioning: 80% complete... Thick provisioning: 81% complete... Thick provisioning: 82% complete... Thick provisioning: 83% complete... Thick provisioning: 84% complete... Thick provisioning: 85% complete... Thick provisioning: 86% complete... Thick provisioning: 87% complete... Thick provisioning: 88% complete... Thick provisioning: 89% complete... Thick provisioning: 90% complete... Thick provisioning: 91% complete... Thick provisioning: 92% complete... Thick provisioning: 93% complete... Thick provisioning: 94% complete... Thick provisioning: 95% complete... Thick provisioning: 96% complete... Thick provisioning: 97% complete... Thick provisioning: 98% complete... Thick provisioning: 99% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done.
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ count=0
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 -lt 10 ']'
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ tr -s ' '
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^4 GiB'
2026-03-25T15:32:54.249 INFO:tasks.workunit.client.0.vm04.stderr:+ cut -d ' ' -f 4-5
2026-03-25T15:32:54.284 INFO:tasks.workunit.client.0.vm04.stdout:4 GiB
2026-03-25T15:32:54.284 INFO:tasks.workunit.client.0.vm04.stderr:+ ret=0
2026-03-25T15:32:54.284 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 = 0 ']'
2026-03-25T15:32:54.284 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:32:54.284 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd du
2026-03-25T15:32:54.316 INFO:tasks.workunit.client.0.vm04.stdout:NAME PROVISIONED USED
2026-03-25T15:32:54.316 INFO:tasks.workunit.client.0.vm04.stdout:test1 4 GiB 4 GiB
2026-03-25T15:32:54.319 INFO:tasks.workunit.client.0.vm04.stderr:+ '[' 0 '!=' 0 ']'
2026-03-25T15:32:54.320 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm test1
2026-03-25T15:32:54.433 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete...2026-03-25T15:32:54.432+0000 7f4238ac7640 0 -- 192.168.123.104:0/2719622323 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x561a491b9cc0 msgr2=0x561a4920b6d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:54.454 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 2% complete... Removing image: 3% complete...2026-03-25T15:32:54.454+0000 7f423703d640 0 -- 192.168.123.104:0/2719622323 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7f421000a3f0 msgr2=0x7f42100050e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:55.657 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done.
2026-03-25T15:32:55.661 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd ls 2026-03-25T15:32:55.661 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test1 2026-03-25T15:32:55.661 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:32:55.661 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:32:55.690 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:32:55.691 INFO:tasks.workunit.client.0.vm04.stderr:+ test_namespace 2026-03-25T15:32:55.691 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing namespace...' 2026-03-25T15:32:55.691 INFO:tasks.workunit.client.0.vm04.stdout:testing namespace... 2026-03-25T15:32:55.691 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:32:55.691 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:55.776 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:55.861 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:55.941 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.028 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.124 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.221 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.306 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.394 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.485 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.580 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.673 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.764 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.856 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:56.956 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:57.051 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:57.140 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:57.242 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:32:57.336 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace ls 2026-03-25T15:32:57.336 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:32:57.336 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$' 2026-03-25T15:32:57.371 INFO:tasks.workunit.client.0.vm04.stdout:0 2026-03-25T15:32:57.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd/test1 2026-03-25T15:32:57.412 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create --pool rbd --namespace test2 2026-03-25T15:32:57.454 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create --namespace test3 2026-03-25T15:32:57.491 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd namespace create rbd/test3 2026-03-25T15:32:57.491 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd/test3 2026-03-25T15:32:57.523 INFO:tasks.workunit.client.0.vm04.stderr:rbd: failed to created namespace: (17) File exists 2026-03-25T15:32:57.524 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:57.523+0000 7fbb5da20300 -1 librbd::api::Namespace: create: failed to add namespace: (17) File exists 2026-03-25T15:32:57.526 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:32:57.527 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace list 2026-03-25T15:32:57.527 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test 2026-03-25T15:32:57.527 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:32:57.527 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^3$' 2026-03-25T15:32:57.556 INFO:tasks.workunit.client.0.vm04.stdout:3 2026-03-25T15:32:57.557 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd namespace remove --pool rbd missing 2026-03-25T15:32:57.557 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove --pool rbd missing
2026-03-25T15:32:57.578 INFO:tasks.workunit.client.0.vm04.stderr:rbd: namespace name was not specified
2026-03-25T15:32:57.580 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:32:57.580 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image1
2026-03-25T15:32:57.628 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 32M --io-size 4K rbd/test1/image1
2026-03-25T15:32:57.672 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random
2026-03-25T15:32:57.779 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:57.778+0000 7fb557fff640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fb538004a30 msgr2=0x7fb538025430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:32:57.839 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:57.839+0000 7fb55ffff640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fb544008d20 msgr2=0x7fb5440291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:58.779 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:32:58.779 INFO:tasks.workunit.client.0.vm04.stdout: 1 2544 2314.64 9.0 MiB/s
2026-03-25T15:32:58.868 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:58.868+0000 7fb55ffff640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fb538004a30 msgr2=0x7fb54007fb10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:59.104 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:59.103+0000 7fb55ffff640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55acc8217fe0 msgr2=0x7fb5401c6c40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:59.437 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:32:59.435+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb544008d20 msgr2=0x7fb540080160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:32:59.687 INFO:tasks.workunit.client.0.vm04.stdout: 2 4000 1994.04 7.8 MiB/s
2026-03-25T15:33:00.673 INFO:tasks.workunit.client.0.vm04.stdout: 3 5984 2000.66 7.8 MiB/s
2026-03-25T15:33:00.783 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:00.782+0000 7fb55ffff640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55acc8217fe0 msgr2=0x7fb5400806a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:01.222 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:01.221+0000 7fb557fff640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fb54005f0a0 msgr2=0x7fb54007f4a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:02.259 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:02.257+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55acc8217fe0 msgr2=0x7fb540080680 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:02.785 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:02.783+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55acc8217fe0 msgr2=0x7fb54016a280 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:03.044 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:03.042+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb54005f0a0 msgr2=0x7fb54007f4a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:03.147 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:03.145+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb54005f0a0 msgr2=0x7fb54007f4a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:03.184 INFO:tasks.workunit.client.0.vm04.stdout: 5 7056 1283.72 5.0 MiB/s
2026-03-25T15:33:03.977 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:03.975+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb54005f0a0 msgr2=0x7fb54007f4a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:04.609 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:04.607+0000 7fb565809640 0 -- 192.168.123.104:0/902332419 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55acc8217fe0 msgr2=0x7fb54016a280 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:05.342 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 7 ops: 8192 ops/sec: 1068.33 bytes/sec: 4.2 MiB/s
2026-03-25T15:33:05.354 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd/test1/image1@1
2026-03-25T15:33:05.958 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 100% complete...done.
2026-03-25T15:33:05.966 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/test1/image1@1 rbd/test2/image1
2026-03-25T15:33:06.034 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd/test1/image1@1
2026-03-25T15:33:06.080 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:33:06.098 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/test1/image1 -
2026-03-25T15:33:06.098 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /dev/fd/63 /dev/fd/62
2026-03-25T15:33:06.098 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/test2/image1 -
2026-03-25T15:33:06.341 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:06.338+0000 7f84d609a640 0 -- 192.168.123.104:0/803445199 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e66f2b97a0 msgr2=0x55e66f2d9b80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:06.389 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:06.387+0000 7f84d609a640 0 -- 192.168.123.104:0/803445199 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f84b405f030 msgr2=0x7f84b407f430 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:06.563 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:06.560+0000 7efe8ca9a640 0 -- 192.168.123.104:0/2553725139 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55e261dfdb60 msgr2=0x55e261e4ff10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:06.657 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:06.654+0000 7efe8ca9a640 0 -- 192.168.123.104:0/2553725139 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7efe6c05f080 msgr2=0x7efe6c07f480 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:07.582 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:33:07.582 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:33:07.595 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test2/image1
2026-03-25T15:33:07.706 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:07.705+0000 7f7ff9d29640 0 -- 192.168.123.104:0/1197751238 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x55d07e694910 msgr2=0x55d07e799ba0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:07.722 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:07.721+0000 7f7ff3fff640 0 -- 192.168.123.104:0/1197751238 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f7fd0008d20 msgr2=0x7f7fd00291a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:08.466 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:33:08.469 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd/image2
2026-03-25T15:33:08.513 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 32M --io-size 4K rbd/image2
2026-03-25T15:33:08.556 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random
2026-03-25T15:33:08.763 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:08.761+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5575f764d850 msgr2=0x5575f76da720 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:10.353 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:33:10.353 INFO:tasks.workunit.client.0.vm04.stdout: 1 4000 2243.57 8.8 MiB/s
2026-03-25T15:33:11.053 INFO:tasks.workunit.client.0.vm04.stdout: 2 4016 1619.93 6.3 MiB/s
2026-03-25T15:33:11.260 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:11.259+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1fc05f100 msgr2=0x7fb1fc157f70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:11.304 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:11.302+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1fc05f100 msgr2=0x7fb1fc157f70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:11.661 INFO:tasks.workunit.client.0.vm04.stdout: 3 4048 1311.81 5.1 MiB/s
2026-03-25T15:33:12.275 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:12.273+0000 7fb21a8c3640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7fb1f802b780 msgr2=0x7fb1fc07f4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:12.432 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:12.430+0000 7fb21b0c4640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fb1f80047a0 msgr2=0x7fb1f80250f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:12.575 INFO:tasks.workunit.client.0.vm04.stdout: 4 5008 1252.55 4.9 MiB/s
2026-03-25T15:33:12.620 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:12.617+0000 7fb21b0c4640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fb1f802b780 msgr2=0x7fb1fc07f4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:13.147 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:13.145+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x5575f764d850 msgr2=0x7fb1fc1584b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:13.150 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:13.149+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1f802b780 msgr2=0x7fb1fc07f4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:13.152 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:13.151+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1fc081c70 msgr2=0x7fb1fc158490 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:13.173 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:13.172+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1fc11fa70 msgr2=0x7fb1fc07f4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:13.657 INFO:tasks.workunit.client.0.vm04.stdout: 5 6448 1268.94 5.0 MiB/s
2026-03-25T15:33:14.095 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:14.093+0000 7fb21b0c4640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fb1fc0aa6c0 msgr2=0x7fb1fc07f4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:14.103 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:14.102+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1fc158490 msgr2=0x7fb1fc0abed0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:14.105 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:14.103+0000 7fb21c34d640 0 -- 192.168.123.104:0/1427071167 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fb1fc11efa0 msgr2=0x7fb1fc07f4e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:15.683 INFO:tasks.workunit.client.0.vm04.stdout: 7 7536 663.289 2.6 MiB/s
2026-03-25T15:33:17.833 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 9 ops: 8192 ops/sec: 883.138 bytes/sec: 3.4 MiB/s
2026-03-25T15:33:17.847 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd/image2@1
2026-03-25T15:33:18.006 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 100% complete...done.
2026-03-25T15:33:18.017 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/image2@1 rbd/test2/image2
2026-03-25T15:33:18.087 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd/image2@1
2026-03-25T15:33:18.130 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done.
2026-03-25T15:33:18.145 INFO:tasks.workunit.client.0.vm04.stderr:+ cmp /dev/fd/63 /dev/fd/62
2026-03-25T15:33:18.145 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/image2 -
2026-03-25T15:33:18.145 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd export rbd/test2/image2 -
2026-03-25T15:33:18.522 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:18.520+0000 7f49890a4640 0 -- 192.168.123.104:0/834382604 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f496c04f870 msgr2=0x7f496c06fc50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:18.564 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:18.563+0000 7f49890a4640 0 -- 192.168.123.104:0/834382604 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x559c05a34b60 msgr2=0x7f496807fb80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:18.825 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:18.822+0000 7faaf48bb640 0 -- 192.168.123.104:0/3937613429 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x7faad00040d0 msgr2=0x7faad0001270 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-25T15:33:18.906 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:18.904+0000 7faaf6345640 0 -- 192.168.123.104:0/3937613429 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x564cd6988c70 msgr2=0x564cd69a9050 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:33:19.797 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:33:19.797 INFO:tasks.workunit.client.0.vm04.stderr: Exporting image: 100% complete...done.
2026-03-25T15:33:19.805 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd rm rbd/image2
2026-03-25T15:33:19.805 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/image2
2026-03-25T15:33:19.859 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:19.858+0000 7f014afd4300 -1 librbd::api::Image: remove: image has snapshots - not removing
2026-03-25T15:33:19.859 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 0% complete...failed.
2026-03-25T15:33:19.863 INFO:tasks.workunit.client.0.vm04.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-25T15:33:19.867 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:19.867 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test2/image2 2026-03-25T15:33:19.970 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... 
Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... 
Removing image: 99% complete...2026-03-25T15:33:19.969+0000 7f331b7fe640 0 -- 192.168.123.104:0/4039967513 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f32fc009b60 msgr2=0x7f32fc009fb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:19.984 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:19.983+0000 7f331b7fe640 0 -- 192.168.123.104:0/4039967513 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f330005f080 msgr2=0x7f330007f480 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:20.984 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:33:20.988 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/image2 2026-03-25T15:33:21.129 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 1%-7% progress messages elided] 2026-03-25T15:33:21.128+0000 7fa31186c640 0 -- 192.168.123.104:0/1755757896 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55bc11ac4cc0 msgr2=0x55bc11aa0910 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:21.161 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 8%-14% progress messages elided] 2026-03-25T15:33:21.160+0000 7fa31186c640 0 -- 192.168.123.104:0/1755757896 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fa2f005f130 msgr2=0x7fa2f007f530 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:21.531 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 15%-64% progress messages elided]
[Removing image: 65%-99% progress messages elided] Removing image: 100% complete...done. 2026-03-25T15:33:21.535 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image3 2026-03-25T15:33:21.589 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd/test1/image3@1 2026-03-25T15:33:21.968 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-25T15:33:21.983 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd/test1/image3@1 2026-03-25T15:33:22.032 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone --rbd-default-clone-format 1 rbd/test1/image3@1 rbd/test1/image4 2026-03-25T15:33:22.109 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test1/image4 2026-03-25T15:33:22.238 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 1% complete... Removing image: 2% complete...
[Removing image: 3%-64% progress messages elided]
[Removing image: 65%-99% progress messages elided] 2026-03-25T15:33:22.237+0000 7f3fca664640 0 -- 192.168.123.104:0/631944939 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x558a80a9b540 msgr2=0x558a80a77a00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-25T15:33:22.253 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:33:22.256 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect rbd/test1/image3@1 2026-03-25T15:33:22.302 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap rm rbd/test1/image3@1 2026-03-25T15:33:23.680 INFO:tasks.workunit.client.0.vm04.stderr: Removing snap: 100% complete...done. 2026-03-25T15:33:23.694 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd/test1/image3 2026-03-25T15:33:23.814 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 1%-63% progress messages elided]
[Removing image: 64%-99% progress messages elided] 2026-03-25T15:33:23.812+0000 7fcfe59c5640 0 -- 192.168.123.104:0/4201243193 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x557669a6fab0 msgr2=0x557669a8ff30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:23.826 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:33:23.830 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G --namespace test1 image2 2026-03-25T15:33:23.881 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd namespace remove rbd/test1 2026-03-25T15:33:23.881 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove rbd/test1 2026-03-25T15:33:23.917 INFO:tasks.workunit.client.0.vm04.stderr:rbd: namespace contains images which must be deleted first.
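The `expect_fail` calls traced above wrap commands that are required to fail. A minimal sketch of such a helper (the body here is an assumption; the actual definition lives in the workunit's shell library):

```shell
# Hypothetical sketch of an expect_fail-style helper: invert the
# wrapped command's exit status so a required failure counts as success.
expect_fail() {
  if "$@"; then
    return 1  # command unexpectedly succeeded
  fi
  return 0    # command failed, as expected
}
```

In the trace above, `expect_fail rbd namespace remove rbd/test1` passes precisely because the namespace still contains an image and the removal errors out.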
2026-03-25T15:33:23.921 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:23.921 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group create rbd/test1/group1 2026-03-25T15:33:23.960 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image add rbd/test1/group1 rbd/test1/image1 2026-03-25T15:33:24.004 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image add --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image2 2026-03-25T15:33:24.061 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image rm --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image1 2026-03-25T15:33:24.113 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group image rm rbd/test1/group1 rbd/test1/image2 2026-03-25T15:33:24.160 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd group rm rbd/test1/group1 2026-03-25T15:33:24.198 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash move rbd/test1/image1 2026-03-25T15:33:24.265 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash --namespace test1 ls 2026-03-25T15:33:24.265 INFO:tasks.workunit.client.0.vm04.stderr:++ cut -d ' ' -f 1 2026-03-25T15:33:24.293 INFO:tasks.workunit.client.0.vm04.stderr:+ ID=28c3786513fe 2026-03-25T15:33:24.293 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash rm rbd/test1/28c3786513fe 2026-03-25T15:33:24.669 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 1%-17% progress messages elided]
[Removing image: 18%-31% progress messages elided] 2026-03-25T15:33:24.668+0000 7f5cc1ace640 0 -- 192.168.123.104:0/1367301038 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7f5ca4006690 msgr2=0x7f5ca4026a70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:24.725 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 32%-40% progress messages elided] 2026-03-25T15:33:24.724+0000 7f5cc3558640 0 -- 192.168.123.104:0/1367301038 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x55825b70f7c0 msgr2=0x55825b7b38f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:25.089 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 41%-58% progress messages elided]
[Removing image: 59%-99% progress messages elided] Removing image: 100% complete...done. 2026-03-25T15:33:25.095 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd remove rbd/test1/image2 2026-03-25T15:33:25.326 INFO:tasks.workunit.client.0.vm04.stderr: [Removing image: 1%-15% progress messages elided]
[Removing image: 16%-77% progress messages elided]
[Removing image: 78%-99% progress messages elided] 2026-03-25T15:33:25.324+0000 7f1cb007d640 0 -- 192.168.123.104:0/666593223 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x562f22252a80 msgr2=0x562f22272f00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:33:25.340 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done.
2026-03-25T15:33:25.344 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove --pool rbd --namespace test1 2026-03-25T15:33:25.571 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:33:25.570+0000 7fa51d4ee640 0 --2- 192.168.123.104:0/2006258284 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x55b702ab4780 0x55b702aa40a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-25T15:33:25.607 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove --namespace test3 2026-03-25T15:33:25.689 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace list 2026-03-25T15:33:25.689 INFO:tasks.workunit.client.0.vm04.stderr:+ grep test 2026-03-25T15:33:25.689 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l 2026-03-25T15:33:25.689 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$' 2026-03-25T15:33:25.717 INFO:tasks.workunit.client.0.vm04.stdout:1 2026-03-25T15:33:25.717 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace remove rbd/test2 2026-03-25T15:33:25.786 INFO:tasks.workunit.client.0.vm04.stderr:+ test_trash_purge_schedule 2026-03-25T15:33:25.786 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing trash purge schedule...' 2026-03-25T15:33:25.786 INFO:tasks.workunit.client.0.vm04.stdout:testing trash purge schedule... 
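The teardown above illustrates the constraint the workunit keeps exercising: `rbd namespace remove` refuses while images remain in the namespace. A condensed sketch of that lifecycle, using a hypothetical `ns_demo` namespace and `img1` image name; this requires a running Ceph cluster with an initialized `rbd` pool and is not runnable standalone:

```shell
# Create a namespace and an image inside it (requires a live cluster).
rbd namespace create rbd/ns_demo
rbd create --size 1G rbd/ns_demo/img1

# Removal fails while the namespace is non-empty:
# "rbd: namespace contains images which must be deleted first."
rbd namespace remove rbd/ns_demo || true

# Empty the namespace, then removal succeeds.
rbd rm rbd/ns_demo/img1
rbd namespace remove rbd/ns_demo
```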
2026-03-25T15:33:25.786 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:33:25.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS [18 identical loop-iteration trace lines, 15:33:25.786-15:33:27.648, elided] 2026-03-25T15:33:27.737 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:33:28.879 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:33:28.891 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:33:31.885 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns1 2026-03-25T15:33:31.947
INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd trash purge schedule list 2026-03-25T15:33:32.291 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{}' = '{}' 2026-03-25T15:33:32.291 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd trash purge schedule status 2026-03-25T15:33:32.291 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep '"scheduled": []' 2026-03-25T15:33:32.628 INFO:tasks.workunit.client.0.vm04.stdout: "scheduled": [] 2026-03-25T15:33:32.629 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-25T15:33:32.629 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls 2026-03-25T15:33:32.661 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:32.662 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R --format json 2026-03-25T15:33:32.700 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-25T15:33:32.700 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove dummy 2026-03-25T15:33:32.700 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove dummy 2026-03-25T15:33:32.732 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:33:32.731+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:33:32.733 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:33:32.737 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:32.737 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy 2026-03-25T15:33:32.737 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove 1d dummy 2026-03-25T15:33:32.770 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:33:32.769+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:33:32.771 
INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:33:32.773 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:32.773 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy 2026-03-25T15:33:32.773 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd dummy 2026-03-25T15:33:32.805 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:33:32.803+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:33:32.805 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:33:32.808 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:32.808 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy 2026-03-25T15:33:32.809 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy 2026-03-25T15:33:32.842 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:33:32.840+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:33:32.842 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:33:32.846 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:33:32.846 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd 1d 01:30 2026-03-25T15:33:32.892 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd 2026-03-25T15:33:32.892 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30' 2026-03-25T15:33:32.929 INFO:tasks.workunit.client.0.vm04.stdout:every 1d starting at 
01:30:00
2026-03-25T15:33:32.929 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls
2026-03-25T15:33:32.929 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:33:32.963 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:33:32.963 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R
2026-03-25T15:33:32.963 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30'
2026-03-25T15:33:33.004 INFO:tasks.workunit.client.0.vm04.stdout:rbd - every 1d starting at 01:30:00
2026-03-25T15:33:33.005 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R -p rbd
2026-03-25T15:33:33.005 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30'
2026-03-25T15:33:33.050 INFO:tasks.workunit.client.0.vm04.stdout:rbd - every 1d starting at 01:30:00
2026-03-25T15:33:33.050 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2
2026-03-25T15:33:33.050 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2
2026-03-25T15:33:33.100 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:33:33.100 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json
2026-03-25T15:33:33.145 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-25T15:33:33.145 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd2/ns1 2d
2026-03-25T15:33:33.198 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json
2026-03-25T15:33:33.243 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[{"pool":"rbd2","namespace":"ns1","items":[{"interval":"2d","start_time":""}]}]' '!=' '[]'
2026-03-25T15:33:33.243 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2 -R
2026-03-25T15:33:33.243 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *every 2d'
2026-03-25T15:33:33.283 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 2d
2026-03-25T15:33:33.283 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm -p rbd2/ns1
2026-03-25T15:33:33.325 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json
2026-03-25T15:33:33.366 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-25T15:33:33.366 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12
2026-03-25T15:33:33.367 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:33:33.367 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml
2026-03-25T15:33:33.368 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:33:33.418 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd
2026-03-25T15:33:33.418 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:33:43.419 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:33:43.419 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml
2026-03-25T15:33:43.420 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:33:43.454 INFO:tasks.workunit.client.0.vm04.stderr:+ test '' = rbd
2026-03-25T15:33:43.454 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:33:53.455 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:33:53.456 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml
2026-03-25T15:33:53.456 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:33:53.490 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd
2026-03-25T15:33:53.491 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:33:53.491 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status
2026-03-25T15:33:53.521 INFO:tasks.workunit.client.0.vm04.stdout:POOL NAMESPACE SCHEDULE TIME
2026-03-25T15:33:53.521 INFO:tasks.workunit.client.0.vm04.stdout:rbd 2026-03-26 01:30:00
2026-03-25T15:33:53.525 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml
2026-03-25T15:33:53.525 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:33:53.557 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd
2026-03-25T15:33:53.558 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status -p rbd --format xml
2026-03-25T15:33:53.558 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:33:53.595 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd
2026-03-25T15:33:53.595 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 2d 00:17
2026-03-25T15:33:53.632 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:33:53.632 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17'
2026-03-25T15:33:53.661 INFO:tasks.workunit.client.0.vm04.stdout:every 2d starting at 00:17:00
2026-03-25T15:33:53.662 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R
2026-03-25T15:33:53.662 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17'
2026-03-25T15:33:53.692 INFO:tasks.workunit.client.0.vm04.stdout:- - every 2d starting at 00:17:00
2026-03-25T15:33:53.692 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2
2026-03-25T15:33:53.692 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2
2026-03-25T15:33:53.723 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:33:53.723 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2 -R
2026-03-25T15:33:53.723 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17'
2026-03-25T15:33:53.755 INFO:tasks.workunit.client.0.vm04.stdout:- - every 2d starting at 00:17:00
2026-03-25T15:33:53.755 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd2/ns1 -R
2026-03-25T15:33:53.755 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17'
2026-03-25T15:33:53.793 INFO:tasks.workunit.client.0.vm04.stdout:- - every 2d starting at 00:17:00
2026-03-25T15:33:53.793 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml
2026-03-25T15:33:53.793 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //schedules/schedule/pool
2026-03-25T15:33:53.831 INFO:tasks.workunit.client.0.vm04.stderr:+ test - = -
2026-03-25T15:33:53.831 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml
2026-03-25T15:33:53.831 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //schedules/schedule/namespace
2026-03-25T15:33:53.869 INFO:tasks.workunit.client.0.vm04.stderr:+ test - = -
2026-03-25T15:33:53.869 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml
2026-03-25T15:33:53.870 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //schedules/schedule/items/item/start_time
2026-03-25T15:33:53.908 INFO:tasks.workunit.client.0.vm04.stderr:+ test 00:17:00 = 00:17:00
2026-03-25T15:33:53.908 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12
2026-03-25T15:33:53.908 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:33:53.909 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:33:53.909 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:33:53.909 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:33:53.942 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:34:03.943 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:03.943 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:03.943 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:03.944 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:03.977 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:34:13.978 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:13.978 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:13.978 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:13.978 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:14.008 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:34:24.010 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:24.010 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:24.010 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:24.010 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:24.039 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:34:34.041 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:34.041 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:34.041 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:34.041 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:34.072 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:34:44.074 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:44.074 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:44.074 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:44.074 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:44.107 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:34:54.109 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:54.109 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:54.109 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:54.109 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:54.141 INFO:tasks.workunit.client.0.vm04.stdout:rbd2
2026-03-25T15:34:54.142 INFO:tasks.workunit.client.0.vm04.stdout:rbd2
2026-03-25T15:34:54.142 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:34:54.142 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status
2026-03-25T15:34:54.168 INFO:tasks.workunit.client.0.vm04.stdout:POOL NAMESPACE SCHEDULE TIME
2026-03-25T15:34:54.168 INFO:tasks.workunit.client.0.vm04.stdout:rbd 2026-03-26 01:30:00
2026-03-25T15:34:54.168 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 2026-03-26 00:17:00
2026-03-25T15:34:54.168 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-26 00:17:00
2026-03-25T15:34:54.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status --format xml
2026-03-25T15:34:54.171 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:54.171 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2
2026-03-25T15:34:54.205 INFO:tasks.workunit.client.0.vm04.stdout:rbd2
2026-03-25T15:34:54.205 INFO:tasks.workunit.client.0.vm04.stdout:rbd2
2026-03-25T15:34:54.205 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd rbd2 rbd2'
2026-03-25T15:34:54.206 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status --format xml
2026-03-25T15:34:54.206 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:54.237 INFO:tasks.workunit.client.0.vm04.stderr:+ echo rbd rbd2 rbd2
2026-03-25T15:34:54.238 INFO:tasks.workunit.client.0.vm04.stdout:rbd rbd2 rbd2
2026-03-25T15:34:54.238 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule status -p rbd --format xml
2026-03-25T15:34:54.238 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:54.271 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd = rbd
2026-03-25T15:34:54.271 INFO:tasks.workunit.client.0.vm04.stderr:+++ rbd trash purge schedule status -p rbd2 --format xml
2026-03-25T15:34:54.272 INFO:tasks.workunit.client.0.vm04.stderr:+++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-25T15:34:54.305 INFO:tasks.workunit.client.0.vm04.stderr:++ echo rbd2 rbd2
2026-03-25T15:34:54.305 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'rbd2 rbd2' = 'rbd2 rbd2'
2026-03-25T15:34:54.306 INFO:tasks.workunit.client.0.vm04.stderr:+++ rbd trash purge schedule ls -R --format xml
2026-03-25T15:34:54.306 INFO:tasks.workunit.client.0.vm04.stderr:+++ xmlstarlet sel -t -v //schedules/schedule/items
2026-03-25T15:34:54.340 INFO:tasks.workunit.client.0.vm04.stderr:++ echo 2d00:17:00 1d01:30:00
2026-03-25T15:34:54.341 INFO:tasks.workunit.client.0.vm04.stderr:+ test '2d00:17:00 1d01:30:00' = '2d00:17:00 1d01:30:00'
2026-03-25T15:34:54.341 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 1d
2026-03-25T15:34:54.379 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:34:54.379 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17'
2026-03-25T15:34:54.409 INFO:tasks.workunit.client.0.vm04.stdout:every 1d, every 2d starting at 00:17:00
2026-03-25T15:34:54.410 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:34:54.410 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d'
2026-03-25T15:34:54.438 INFO:tasks.workunit.client.0.vm04.stdout:every 1d, every 2d starting at 00:17:00
2026-03-25T15:34:54.438 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -R --format xml
2026-03-25T15:34:54.438 INFO:tasks.workunit.client.0.vm04.stderr:+ xmlstarlet sel -t -v //schedules/schedule/items
2026-03-25T15:34:54.438 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 2d00:17
2026-03-25T15:34:54.473 INFO:tasks.workunit.client.0.vm04.stdout:1d2d00:17:00
2026-03-25T15:34:54.473 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm 1d
2026-03-25T15:34:54.511 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:34:54.511 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2d starting at 00:17'
2026-03-25T15:34:54.539 INFO:tasks.workunit.client.0.vm04.stdout:every 2d starting at 00:17:00
2026-03-25T15:34:54.539 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm 2d 00:17
2026-03-25T15:34:54.575 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule ls
2026-03-25T15:34:54.575 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:34:54.603 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:34:54.603 INFO:tasks.workunit.client.0.vm04.stderr:+ for p in rbd2 rbd2/ns1
2026-03-25T15:34:54.603 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1
2026-03-25T15:34:54.645 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv rbd2/ns1/test1
2026-03-25T15:34:54.699 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:34:54.699 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:34:54.699 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:34:54.733 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:34:54.734 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd2 1m
2026-03-25T15:34:54.773 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-25T15:34:54.774 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:34:54.805 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m
2026-03-25T15:34:54.805 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-25T15:34:54.805 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:34:54.843 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m
2026-03-25T15:34:54.844 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12
2026-03-25T15:34:54.845 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:34:54.845 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:34:54.845 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:34:54.846 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:34:54.882 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:34:54.883 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:35:04.885 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:04.885 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:04.885 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:04.885 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:04.928 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:35:04.928 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:04.929 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:04.929 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:35:04.970 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:35:04.971 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-25T15:35:04.971 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:35:05.021 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m
2026-03-25T15:35:05.021 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-25T15:35:05.021 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:35:05.058 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 - every 1m
2026-03-25T15:35:05.058 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status
2026-03-25T15:35:05.058 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1'
2026-03-25T15:35:05.094 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-25 15:36:00
2026-03-25T15:35:05.094 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2
2026-03-25T15:35:05.094 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1'
2026-03-25T15:35:05.131 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-25 15:36:00
2026-03-25T15:35:05.132 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2/ns1
2026-03-25T15:35:05.132 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1'
2026-03-25T15:35:05.166 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-25 15:36:00
2026-03-25T15:35:05.167 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm -p rbd2 1m
2026-03-25T15:35:05.213 INFO:tasks.workunit.client.0.vm04.stderr:+ for p in rbd2 rbd2/ns1
2026-03-25T15:35:05.213 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1
2026-03-25T15:35:05.323 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash mv rbd2/ns1/test1
2026-03-25T15:35:05.463 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:05.463 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:05.463 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:05.494 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:05.494 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd2/ns1 1m
2026-03-25T15:35:05.556 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-25T15:35:05.557 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:35:05.598 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m
2026-03-25T15:35:05.598 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-25T15:35:05.598 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:35:05.639 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m
2026-03-25T15:35:05.640 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12
2026-03-25T15:35:05.641 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:05.641 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:05.641 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:05.641 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:05.681 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:05.682 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:35:15.683 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:15.683 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:15.683 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:15.684 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:15.717 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:15.717 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:35:25.718 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:25.719 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:25.719 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:25.719 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:25.750 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:25.750 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:35:35.751 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:35.751 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:35.751 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:35.751 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:35.778 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:35.778 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:35:45.779 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:45.779 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:45.779 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:45.779 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:45.810 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:45.810 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:35:55.811 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:35:55.811 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:35:55.812 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:35:55.812 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:35:55.846 INFO:tasks.workunit.client.0.vm04.stdout:1
2026-03-25T15:35:55.846 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:36:05.848 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12`
2026-03-25T15:36:05.848 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:36:05.848 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:36:05.848 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^1$'
2026-03-25T15:36:05.878 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:36:05.879 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash ls rbd2/ns1
2026-03-25T15:36:05.879 INFO:tasks.workunit.client.0.vm04.stderr:+ wc -l
2026-03-25T15:36:05.879 INFO:tasks.workunit.client.0.vm04.stderr:+ grep '^0$'
2026-03-25T15:36:05.911 INFO:tasks.workunit.client.0.vm04.stdout:0
2026-03-25T15:36:05.911 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-25T15:36:05.911 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:36:05.947 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m
2026-03-25T15:36:05.947 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-25T15:36:05.947 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m'
2026-03-25T15:36:05.981 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 every 1m
2026-03-25T15:36:05.982 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status
2026-03-25T15:36:05.982 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1'
2026-03-25T15:36:06.012 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-25 15:37:00
2026-03-25T15:36:06.012 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2
2026-03-25T15:36:06.012 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1'
2026-03-25T15:36:06.044 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-25 15:37:00
2026-03-25T15:36:06.045 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule status -p rbd2/ns1
2026-03-25T15:36:06.045 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1'
2026-03-25T15:36:06.081 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 2026-03-25 15:37:00
2026-03-25T15:36:06.081 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule rm -p rbd2/ns1 1m
2026-03-25T15:36:06.123 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 2m
2026-03-25T15:36:06.166 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add -p rbd dummy
2026-03-25T15:36:06.166 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd dummy
2026-03-25T15:36:06.196 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.194+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-25T15:36:06.196 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-25T15:36:06.199 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.200 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add -p rbd 1d dummy
2026-03-25T15:36:06.200 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd 1d dummy
2026-03-25T15:36:06.231 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.230+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.231 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.234 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.234 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add dummy
2026-03-25T15:36:06.234 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add dummy
2026-03-25T15:36:06.263 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.261+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-25T15:36:06.263 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-25T15:36:06.266 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.266 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add 1d dummy
2026-03-25T15:36:06.266 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add 1d dummy
2026-03-25T15:36:06.292 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.291+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.293 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.295 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.295 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy
2026-03-25T15:36:06.295 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd dummy
2026-03-25T15:36:06.324 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.322+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-25T15:36:06.324 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-25T15:36:06.326 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.326 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy
2026-03-25T15:36:06.326 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy
2026-03-25T15:36:06.358 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.357+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.359 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.362 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.362 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove dummy
2026-03-25T15:36:06.362 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove dummy
2026-03-25T15:36:06.390 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.389+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-25T15:36:06.390 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-25T15:36:06.394 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.394 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy
2026-03-25T15:36:06.394 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove 1d dummy
2026-03-25T15:36:06.422 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:06.420+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.423 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-25T15:36:06.425 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:06.426 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd
2026-03-25T15:36:06.426 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1d starting at 01:30'
2026-03-25T15:36:06.460 INFO:tasks.workunit.client.0.vm04.stdout:every 1d starting at 01:30:00
2026-03-25T15:36:06.461 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls
2026-03-25T15:36:06.461 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2m'
2026-03-25T15:36:06.491 INFO:tasks.workunit.client.0.vm04.stdout:every 2m
2026-03-25T15:36:06.491 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd 1d 01:30
2026-03-25T15:36:06.530 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove 2m
2026-03-25T15:36:06.574 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd trash purge schedule ls -R --format json
2026-03-25T15:36:06.609 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-25T15:36:06.609 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:36:06.609 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:06.707 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:06.806 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.115 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.225 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.326 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.427 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.527 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.640 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.750 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.849 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:07.952 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.056 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.162 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.262 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.377 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.479 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.582 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:08.703 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-25T15:36:09.055 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-25T15:36:09.066 INFO:tasks.workunit.client.0.vm04.stderr:+ test_trash_purge_schedule_recovery
2026-03-25T15:36:09.066 INFO:tasks.workunit.client.0.vm04.stdout:testing recovery of trash_purge_schedule handler after module's RADOS client is blocklisted...
2026-03-25T15:36:09.066 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing recovery of trash_purge_schedule handler after module'\''s RADOS client is blocklisted...'
2026-03-25T15:36:09.066 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:36:09.066 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.201 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.346 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.448 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.552 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.656 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.757 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.855 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:09.955 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.061 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.172 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.270 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.368 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.468 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.568 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:10.896 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:11.001 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:11.107 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:36:11.211 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd3 8
2026-03-25T15:36:12.073 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' already exists
2026-03-25T15:36:12.083 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd3
2026-03-25T15:36:15.025 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd3/ns1
2026-03-25T15:36:15.061 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3/ns1 2d
2026-03-25T15:36:15.100 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-25T15:36:15.100 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd3 *ns1 *every 2d'
2026-03-25T15:36:15.132 INFO:tasks.workunit.client.0.vm04.stdout:rbd3 ns1 every 2d
2026-03-25T15:36:15.133 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump
2026-03-25T15:36:15.133 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-25T15:36:15.133 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]'
2026-03-25T15:36:15.133 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-25T15:36:15.478 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/3897111736
2026-03-25T15:36:15.478 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/3897111736
2026-03-25T15:36:17.023 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/3897111736 until 2026-03-25T16:36:16.109606+0000 (3600 sec)
2026-03-25T15:36:17.036 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd trash purge schedule add -p rbd3 10m
2026-03-25T15:36:17.036 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-25T15:36:17.066 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:17.065+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule)
2026-03-25T15:36:17.067 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule)
2026-03-25T15:36:17.070 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:36:17.070 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:36:27.072 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24
2026-03-25T15:36:27.072 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-25T15:36:27.072 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-25T15:36:27.101 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:27.100+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-25T15:36:27.101 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-25T15:36:27.103 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:36:37.105 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-25T15:36:37.105 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-25T15:36:37.133 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:37.132+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-25T15:36:37.133 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-25T15:36:37.137 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:36:47.138 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-25T15:36:47.138 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash
purge schedule add -p rbd3 10m 2026-03-25T15:36:47.175 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:36:47.174+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-25T15:36:47.176 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-25T15:36:47.179 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:36:57.181 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:36:57.181 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-25T15:36:57.215 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:36:57.216 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-25T15:36:57.216 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 10m' 2026-03-25T15:36:57.247 INFO:tasks.workunit.client.0.vm04.stdout:rbd3 - every 10m 2026-03-25T15:36:57.247 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-25T15:36:57.247 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd3 *ns1 *every 2d' 2026-03-25T15:36:57.278 INFO:tasks.workunit.client.0.vm04.stdout:rbd3 ns1 every 2d 2026-03-25T15:36:57.279 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd3 10m 2026-03-25T15:36:57.310 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule remove -p rbd3/ns1 2d 2026-03-25T15:36:57.342 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-25T15:36:57.342 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'every 10m' 2026-03-25T15:36:57.342 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 10m' 2026-03-25T15:36:57.370 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:36:57.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd trash purge 
schedule ls -p rbd3 -R 2026-03-25T15:36:57.371 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'rbd3 *ns1 *every 2d' 2026-03-25T15:36:57.371 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd3 *ns1 *every 2d' 2026-03-25T15:36:57.403 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:36:57.403 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it 2026-03-25T15:36:58.059 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' does not exist 2026-03-25T15:36:58.070 INFO:tasks.workunit.client.0.vm04.stderr:+ test_mirror_snapshot_schedule 2026-03-25T15:36:58.070 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing mirror snapshot schedule...' 2026-03-25T15:36:58.071 INFO:tasks.workunit.client.0.vm04.stdout:testing mirror snapshot schedule... 2026-03-25T15:36:58.071 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:36:58.071 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.181 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.282 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.406 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.500 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.590 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.681 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.780 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.875 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:58.963 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.055 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.142 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.272 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 
2026-03-25T15:36:59.364 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.456 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.546 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.634 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.729 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:36:59.832 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:37:01.071 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:37:01.081 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:37:04.025 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns1 2026-03-25T15:37:04.069 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd2 image 2026-03-25T15:37:04.102 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd2/ns1 image 2026-03-25T15:37:04.133 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool peer add rbd2 cluster1 2026-03-25T15:37:04.159 INFO:tasks.workunit.client.0.vm04.stdout:447e82d2-027c-404c-b24f-76aa9757c56d 2026-03-25T15:37:04.162 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd mirror snapshot schedule list 2026-03-25T15:37:04.472 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{}' = '{}' 2026-03-25T15:37:04.472 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd mirror snapshot schedule status 2026-03-25T15:37:04.472 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep '"scheduled_images": []' 2026-03-25T15:37:04.778 INFO:tasks.workunit.client.0.vm04.stdout: "scheduled_images": [] 2026-03-25T15:37:04.778 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-25T15:37:04.778 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls 2026-03-25T15:37:04.805 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:04.805 
INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -R --format json 2026-03-25T15:37:04.842 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-25T15:37:04.842 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1 2026-03-25T15:37:04.877 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:04.877 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:04.905 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mirroring not enabled on the image 2026-03-25T15:37:04.909 INFO:tasks.workunit.client.0.vm04.stderr:+ test 0 = 0 2026-03-25T15:37:04.909 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image enable rbd2/ns1/test1 snapshot 2026-03-25T15:37:05.027 INFO:tasks.workunit.client.0.vm04.stdout:Mirroring enabled 2026-03-25T15:37:05.033 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:05.033 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:05.066 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1 2026-03-25T15:37:05.066 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy 2026-03-25T15:37:05.066 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove dummy 2026-03-25T15:37:05.091 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:37:05.089+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:37:05.091 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:37:05.093 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.093 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy 2026-03-25T15:37:05.093 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot 
schedule remove 1h dummy 2026-03-25T15:37:05.118 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:37:05.117+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:37:05.119 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:37:05.121 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.121 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-25T15:37:05.121 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-25T15:37:05.154 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:37:05.153+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:37:05.154 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:37:05.156 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.156 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-25T15:37:05.156 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-25T15:37:05.190 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:37:05.189+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:37:05.190 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:37:05.192 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.192 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1m 2026-03-25T15:37:05.229 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-25T15:37:05.229 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls 2026-03-25T15:37:05.256 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.256 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-25T15:37:05.256 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:37:05.282 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:37:05.282 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2 2026-03-25T15:37:05.282 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-25T15:37:05.311 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.312 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-25T15:37:05.312 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:37:05.344 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:37:05.344 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-25T15:37:05.344 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-25T15:37:05.379 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:37:05.379 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-25T15:37:05.379 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:37:05.412 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:37:05.412 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls 
-p rbd2/ns1 --image test1 2026-03-25T15:37:05.452 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-25T15:37:05.452 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-25T15:37:05.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:37:05.453 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:05.453 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:05.495 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-25T15:37:05.495 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:37:15.497 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:37:15.497 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:15.497 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:15.549 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-25T15:37:15.549 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:37:25.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:37:25.550 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:25.550 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:25.584 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-25T15:37:25.584 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:37:35.585 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:37:35.585 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:35.586 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:35.623 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-25T15:37:35.623 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:37:45.625 
INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:37:45.626 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:45.626 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:45.680 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-25T15:37:45.681 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:37:55.683 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:37:55.683 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:37:55.683 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:37:55.726 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 -gt 1 2026-03-25T15:37:55.726 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:38:05.728 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:05.729 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:38:05.729 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:38:05.775 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 -gt 1 2026-03-25T15:38:05.775 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:38:05.775 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-25T15:38:05.775 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:38:05.819 INFO:tasks.workunit.client.0.vm04.stderr:+ test 2 -gt 1 2026-03-25T15:38:05.819 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-25T15:38:05.819 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls 2026-03-25T15:38:05.853 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:38:05.853 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-25T15:38:05.853 
INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:38:05.887 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:38:05.888 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2 2026-03-25T15:38:05.888 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-25T15:38:05.923 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:38:05.923 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-25T15:38:05.924 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:38:05.957 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:38:05.958 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-25T15:38:05.958 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-25T15:38:05.992 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:38:05.992 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-25T15:38:05.992 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:38:06.029 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:38:06.029 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-25T15:38:06.072 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-25T15:38:06.072 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:06.100 INFO:tasks.workunit.client.0.vm04.stdout:SCHEDULE TIME IMAGE 2026-03-25T15:38:06.100 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:39:00 rbd2/ns1/test1 2026-03-25T15:38:06.104 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status 
--format xml 2026-03-25T15:38:06.104 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-25T15:38:06.137 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-25T15:38:06.138 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status -p rbd2 --format xml 2026-03-25T15:38:06.138 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-25T15:38:06.172 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-25T15:38:06.173 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --format xml 2026-03-25T15:38:06.173 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-25T15:38:06.206 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-25T15:38:06.207 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --image test1 --format xml 2026-03-25T15:38:06.207 INFO:tasks.workunit.client.0.vm04.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-25T15:38:06.249 INFO:tasks.workunit.client.0.vm04.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 2026-03-25T15:38:06.249 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image demote rbd2/ns1/test1 2026-03-25T15:38:07.014 INFO:tasks.workunit.client.0.vm04.stdout:Image demoted to non-primary 2026-03-25T15:38:07.019 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-25T15:38:07.019 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:07.019 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:07.020 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:07.046 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:39:00 rbd2/ns1/test1 2026-03-25T15:38:07.046 
INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:38:17.047 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:17.047 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:17.047 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:17.087 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:39:00 rbd2/ns1/test1 2026-03-25T15:38:17.087 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:38:27.088 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:27.089 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:27.089 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:27.119 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:39:00 rbd2/ns1/test1 2026-03-25T15:38:27.120 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:38:37.121 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:37.121 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:37.121 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:37.156 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:39:00 rbd2/ns1/test1 2026-03-25T15:38:37.156 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:38:47.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:47.158 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:47.158 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:47.189 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:39:00 rbd2/ns1/test1 2026-03-25T15:38:47.189 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:38:57.190 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:57.190 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:57.190 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:57.218 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:38:57.219 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:57.219 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep rbd2/ns1/test1 2026-03-25T15:38:57.219 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:57.247 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:38:57.248 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image promote rbd2/ns1/test1 2026-03-25T15:38:58.021 INFO:tasks.workunit.client.0.vm04.stdout:Image promoted to primary 2026-03-25T15:38:58.037 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-25T15:38:58.038 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:38:58.038 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:38:58.038 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:38:58.082 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:39:08.083 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:39:08.083 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:08.083 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:08.111 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:39:18.112 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:39:18.112 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:18.112 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:18.144 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:39:28.145 INFO:tasks.workunit.client.0.vm04.stderr:+ 
for i in `seq 12` 2026-03-25T15:39:28.145 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:28.145 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:28.176 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:39:38.178 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:39:38.178 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:38.178 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:38.204 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:39:48.206 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:39:48.206 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:48.206 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:48.236 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:39:58.238 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:39:58.238 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:58.239 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:58.277 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:40:00 rbd2/ns1/test1 2026-03-25T15:39:58.277 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:39:58.277 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:39:58.277 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:39:58.310 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:40:00 rbd2/ns1/test1 2026-03-25T15:39:58.310 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add 1h 00:15 2026-03-25T15:39:58.353 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls 2026-03-25T15:39:58.388 INFO:tasks.workunit.client.0.vm04.stderr:+ 
test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00' 2026-03-25T15:39:58.388 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-25T15:39:58.388 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1h starting at 00:15:00' 2026-03-25T15:39:58.616 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:39:58.614+0000 7f2b214ba640 0 --2- 192.168.123.104:0/1204674453 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x5561a4b114d0 0x5561a4be5020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-25T15:39:58.626 INFO:tasks.workunit.client.0.vm04.stdout:- - - every 1h starting at 00:15:00 2026-03-25T15:39:58.626 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-25T15:39:58.626 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:39:58.660 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:39:58.660 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2 2026-03-25T15:39:58.660 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-25T15:39:58.696 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:58.696 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-25T15:39:58.696 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1h starting at 00:15:00' 2026-03-25T15:39:58.731 INFO:tasks.workunit.client.0.vm04.stdout:- - - every 1h starting at 00:15:00 2026-03-25T15:39:58.731 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-25T15:39:58.731 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:39:58.772 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:39:58.773 
INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-25T15:39:58.773 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-25T15:39:58.811 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:58.811 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-25T15:39:58.811 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1h starting at 00:15:00' 2026-03-25T15:39:58.854 INFO:tasks.workunit.client.0.vm04.stdout:- - - every 1h starting at 00:15:00 2026-03-25T15:39:58.855 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-25T15:39:58.855 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-25T15:39:58.890 INFO:tasks.workunit.client.0.vm04.stdout:rbd2 ns1 test1 every 1m 2026-03-25T15:39:58.891 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-25T15:39:58.935 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-25T15:39:58.935 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add dummy 2026-03-25T15:39:58.935 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add dummy 2026-03-25T15:39:58.967 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:58.965+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:39:58.967 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:39:58.970 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:58.970 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add 1h dummy 2026-03-25T15:39:58.970 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add 1h dummy 
2026-03-25T15:39:59.000 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:58.999+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.000 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.004 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.004 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy 2026-03-25T15:39:59.004 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy 2026-03-25T15:39:59.047 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:59.045+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:39:59.047 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:39:59.051 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.051 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy 2026-03-25T15:39:59.051 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy 2026-03-25T15:39:59.096 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:59.094+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.096 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.099 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.099 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror 
snapshot schedule remove dummy 2026-03-25T15:39:59.099 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove dummy 2026-03-25T15:39:59.128 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:59.127+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:39:59.128 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:39:59.131 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.131 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy 2026-03-25T15:39:59.131 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove 1h dummy 2026-03-25T15:39:59.160 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:59.159+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.161 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.163 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.163 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-25T15:39:59.163 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy 2026-03-25T15:39:59.206 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:59.205+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-25T15:39:59.207 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-25T15:39:59.211 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.211 
INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-25T15:39:59.211 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy 2026-03-25T15:39:59.255 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:39:59.253+0000 7fea55b89640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.255 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-25T15:39:59.258 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:39:59.258 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls 2026-03-25T15:39:59.288 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00' 2026-03-25T15:39:59.288 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-25T15:39:59.327 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-25T15:39:59.327 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd2/ns1/test1 2026-03-25T15:40:02.037 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:40:02.036+0000 7f8c66279640 0 -- 192.168.123.104:0/427049622 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x56540df71030 msgr2=0x56540df61fe0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:40:03.059 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:40:03.057+0000 7f8c66279640 0 -- 192.168.123.104:0/427049622 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7f8c44060620 msgr2=0x7f8c44080a00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:40:03.088 
INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:40:03.096 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 12 2026-03-25T15:40:03.097 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:40:03.097 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:03.097 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:03.325 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:40:03.323+0000 7fa87111e640 0 --2- 192.168.123.104:0/2854602326 >> v2:192.168.123.104:3300/0 conn(0x55b4227be5a0 0x55b42270d2b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-25T15:40:03.340 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:41:00 rbd2/ns1/test1 2026-03-25T15:40:03.340 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:40:13.342 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:40:13.342 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:13.342 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:13.369 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:41:00 rbd2/ns1/test1 2026-03-25T15:40:13.370 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:40:23.371 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:40:23.371 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:23.371 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:23.398 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:41:00 rbd2/ns1/test1 2026-03-25T15:40:23.398 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:40:33.400 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:40:33.400 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:33.400 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:33.433 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:41:00 rbd2/ns1/test1 2026-03-25T15:40:33.433 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:40:43.435 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:40:43.435 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:43.435 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:43.474 INFO:tasks.workunit.client.0.vm04.stdout:2026-03-25 15:41:00 rbd2/ns1/test1 2026-03-25T15:40:43.474 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:40:53.475 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 12` 2026-03-25T15:40:53.475 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:53.475 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:53.509 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:40:53.509 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule status 2026-03-25T15:40:53.509 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep rbd2/ns1/test1 2026-03-25T15:40:53.509 INFO:tasks.workunit.client.0.vm04.stderr:+ grep rbd2/ns1/test1 2026-03-25T15:40:53.537 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:40:53.538 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule remove 2026-03-25T15:40:53.574 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -R --format json 2026-03-25T15:40:53.605 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-25T15:40:53.605 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:40:53.605 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:53.694 
INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:53.788 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:53.890 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:53.983 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.082 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.176 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.273 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.364 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.458 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.647 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.836 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:54.933 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.027 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.117 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.213 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.307 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-25T15:40:55.641 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist 2026-03-25T15:40:55.651 INFO:tasks.workunit.client.0.vm04.stderr:+ test_mirror_snapshot_schedule_recovery 2026-03-25T15:40:55.651 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing recovery of mirror snapshot scheduler after module'\''s RADOS client is blocklisted...' 
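The `+ expect_fail <cmd>` / `+ <cmd>` / `+ return 0` triples in the trace above show a helper that succeeds only when its command fails. A minimal reconstruction of that helper (inferred from the xtrace, not the verbatim cli_generic.sh source):

```shell
# Invert the command's exit status: an expected failure (e.g. an invalid
# interval like "dummy") counts as success, an unexpected success as failure.
expect_fail() {
    "$@" && return 1 || return 0
}

expect_fail false && echo "caught expected failure"
expect_fail true || echo "command unexpectedly succeeded"
```

This is why every rejected `rbd mirror snapshot schedule add dummy` above is immediately followed by `+ return 0`: the CLI error is the outcome the test wants.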
2026-03-25T15:40:55.651 INFO:tasks.workunit.client.0.vm04.stdout:testing recovery of mirror snapshot scheduler after module's RADOS client is blocklisted... 2026-03-25T15:40:55.652 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:40:55.652 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.743 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.840 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:55.936 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.029 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.328 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.418 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.504 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.599 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.691 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.781 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.873 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:56.962 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:57.054 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:57.150 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:57.240 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:57.331 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:57.428 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:40:57.520 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd3 8 2026-03-25T15:40:57.843 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' already exists 2026-03-25T15:40:57.856 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd3 2026-03-25T15:41:00.788 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd3/ns1 2026-03-25T15:41:00.823 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd3 image 2026-03-25T15:41:00.855 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd3/ns1 image 2026-03-25T15:41:00.889 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool peer add rbd3 cluster1 2026-03-25T15:41:00.918 INFO:tasks.workunit.client.0.vm04.stdout:49ac4296-8cbb-4698-8814-45d8278bfbce 2026-03-25T15:41:00.921 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 -s 1 rbd3/ns1/test1 2026-03-25T15:41:01.149 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:41:01.147+0000 7fbf3d1f0640 0 --2- 192.168.123.104:0/530685368 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x55fbc0cf4180 0x55fbc0d80640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-25T15:41:01.162 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror image enable rbd3/ns1/test1 snapshot 2026-03-25T15:41:01.795 INFO:tasks.workunit.client.0.vm04.stdout:Mirroring enabled 2026-03-25T15:41:01.801 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror image status rbd3/ns1/test1 2026-03-25T15:41:01.801 INFO:tasks.workunit.client.0.vm04.stderr:++ grep -c mirror.primary 2026-03-25T15:41:01.839 INFO:tasks.workunit.client.0.vm04.stderr:+ test 1 = 1 2026-03-25T15:41:01.839 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 1m 2026-03-25T15:41:01.880 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-25T15:41:01.918 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'every 1m' = 'every 1m' 2026-03-25T15:41:01.919 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump 2026-03-25T15:41:01.919 
INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]' 2026-03-25T15:41:01.919 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-25T15:41:01.919 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-25T15:41:02.437 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/3705412354 2026-03-25T15:41:02.437 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/3705412354 2026-03-25T15:41:03.788 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/3705412354 until 2026-03-25T16:41:02.856135+0000 (3600 sec) 2026-03-25T15:41:03.808 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:03.808 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:04.037 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:41:04.036+0000 7f713f908640 0 --2- 192.168.123.104:0/2717243650 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f712c002340 0x7f712c002710 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-25T15:41:04.046 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:04.045+0000 7fea55b89640 -1 librbd::api::Namespace: list: error listing namespaces: (108) Cannot send after transport endpoint shutdown 2026-03-25T15:41:04.047 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:04.045+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RBD connection was shutdown (error listing namespaces) 2026-03-25T15:41:04.047 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: [errno 108] RBD connection was shutdown (error listing namespaces) 
2026-03-25T15:41:04.051 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:41:04.051 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:41:14.053 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24 2026-03-25T15:41:14.054 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:41:14.054 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:14.083 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:14.082+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-25T15:41:14.083 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-25T15:41:14.086 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:41:24.087 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:41:24.088 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:24.120 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:24.118+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-25T15:41:24.120 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-25T15:41:24.123 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:41:34.125 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:41:34.125 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:34.153 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:34.152+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support 
module is not ready, try again 2026-03-25T15:41:34.153 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-25T15:41:34.156 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:41:44.157 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:41:44.157 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:44.185 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:44.184+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-25T15:41:44.186 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-25T15:41:44.188 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:41:54.190 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:41:54.190 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:41:54.226 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:41:54.224+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-25T15:41:54.226 INFO:tasks.workunit.client.0.vm04.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-25T15:41:54.230 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:42:04.232 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:42:04.232 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-25T15:42:04.288 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:42:04.288 
INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-25T15:42:04.288 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2m' 2026-03-25T15:42:04.331 INFO:tasks.workunit.client.0.vm04.stdout:every 1m, every 2m 2026-03-25T15:42:04.331 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-25T15:42:04.331 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-25T15:42:04.376 INFO:tasks.workunit.client.0.vm04.stdout:every 1m, every 2m 2026-03-25T15:42:04.376 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 2m 2026-03-25T15:42:04.441 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 1m 2026-03-25T15:42:04.484 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-25T15:42:04.484 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'every 2m' 2026-03-25T15:42:04.484 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 2m' 2026-03-25T15:42:04.526 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:42:04.526 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-25T15:42:04.526 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'every 1m' 2026-03-25T15:42:04.526 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'every 1m' 2026-03-25T15:42:04.566 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:42:04.566 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap purge rbd3/ns1/test1 2026-03-25T15:42:04.605 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd rm rbd3/ns1/test1 2026-03-25T15:42:06.610 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:42:06.609+0000 7f16ff5a8640 0 -- 192.168.123.104:0/2233988310 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x562fa1067cc0 
msgr2=0x562fa10baa30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-25T15:42:06.723 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:42:06.722+0000 7f16fe31f640 0 -- 192.168.123.104:0/2233988310 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x562fa10f4910 msgr2=0x562fa1114d90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-25T15:42:06.908 INFO:tasks.workunit.client.0.vm04.stderr: Removing image: 100% complete...done. 2026-03-25T15:42:06.912 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it 2026-03-25T15:42:07.457 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' does not exist 2026-03-25T15:42:07.467 INFO:tasks.workunit.client.0.vm04.stderr:+ test_perf_image_iostat 2026-03-25T15:42:07.467 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing perf image iostat...' 2026-03-25T15:42:07.468 INFO:tasks.workunit.client.0.vm04.stdout:testing perf image iostat... 
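The ``for i in `seq 24` `` / `sleep 10` / `break` sequence in the recovery test above is a bounded retry loop waiting for the blocklisted rbd_support module to reinitialize. A sketch of the idiom (the `retry` helper name and the zero delay are illustrative, not from the workunit, which sleeps 10s between attempts):

```shell
# Bounded retry: attempt the command up to $1 times with $2 seconds
# between tries; return non-zero if it never succeeds.
retry() {
    local attempts=$1 delay=$2 i
    shift 2
    for i in $(seq "$attempts"); do
        if "$@"; then
            return 0
        fi
        sleep "$delay"
    done
    return 1
}

n=0
flaky() { n=$((n + 1)); [ "$n" -ge 3 ]; }   # fails twice, then succeeds
retry 24 0 flaky && echo "succeeded on attempt $n"
```

In the log the retried command is `rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m`, which keeps returning "rbd_support module is not ready, try again" until the module's new RADOS client is up.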
2026-03-25T15:42:07.468 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:42:07.468 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:07.578 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:07.720 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:07.904 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:08.088 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:08.280 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:08.489 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:08.653 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:08.760 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:08.890 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.082 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.220 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.360 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.543 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.739 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.852 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:09.970 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:10.123 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:42:10.304 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd1 8 2026-03-25T15:42:11.155 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' already exists 2026-03-25T15:42:11.170 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd1 2026-03-25T15:42:14.137 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd1/ns 2026-03-25T15:42:14.175 
INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:42:15.213 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:42:15.229 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:42:18.254 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd2/ns 2026-03-25T15:42:18.494 INFO:tasks.workunit.client.0.vm04.stderr:+ IMAGE_SPECS=("test1" "rbd1/test2" "rbd1/ns/test3" "rbd2/test4" "rbd2/ns/test5") 2026-03-25T15:42:18.494 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.494 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' test1 2026-03-25T15:42:18.558 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.558 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/test2 2026-03-25T15:42:18.602 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.602 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/ns/test3 2026-03-25T15:42:18.648 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.648 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/test4 2026-03-25T15:42:18.692 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.692 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/ns/test5 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS=() 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!) 
2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false test1 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!) 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/test2 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!) 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/ns/test3 2026-03-25T15:42:18.733 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!) 2026-03-25T15:42:18.734 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-25T15:42:18.734 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!) 
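The interleaved `BENCH_PIDS+=($!)` lines above record the PID of each backgrounded `rbd bench` job so they can all be killed and reaped once the iostat checks pass (the `kill` / `wait` lines further down). The bookkeeping pattern, with `sleep` standing in for `rbd bench`:

```shell
# Start one background worker per image spec, remember each PID via $!,
# then terminate and reap the whole set.
SPECS="test1 rbd1/test2 rbd1/ns/test3"
PIDS=()
for spec in $SPECS; do
    sleep 60 &              # stand-in for: rbd bench --io-type write ... "$spec"
    PIDS+=($!)
done
for pid in "${PIDS[@]}"; do
    kill "$pid"
done
wait "${PIDS[@]}" 2>/dev/null || true   # reap; killed jobs exit non-zero
echo "reaped ${#PIDS[@]} workers"
```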
2026-03-25T15:42:18.734 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/ns/test5 2026-03-25T15:42:18.738 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/test4 2026-03-25T15:42:18.745 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-25T15:42:18.747 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd1 2026-03-25T15:42:18.843 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats 2026-03-25T15:42:18.843 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:42:28.853 INFO:tasks.workunit.client.0.vm04.stderr:+ test test2 = test2 2026-03-25T15:42:28.853 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd1/ns 2026-03-25T15:42:28.853 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-25T15:42:28.896 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats 2026-03-25T15:42:28.896 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-25T15:42:38.908 INFO:tasks.workunit.client.0.vm04.stderr:+ test test3 = test3 2026-03-25T15:42:38.910 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd1 /ns 2026-03-25T15:42:38.910 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-25T15:42:39.177 INFO:tasks.workunit.client.0.vm04.stderr:+ test test3 = test3 2026-03-25T15:42:39.177 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --pool rbd2 2026-03-25T15:42:39.177 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-25T15:42:39.207 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats 2026-03-25T15:42:39.207 
INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-25T15:42:49.218 INFO:tasks.workunit.client.0.vm04.stderr:+ test test4 = test4
2026-03-25T15:42:49.219 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --pool rbd2 --namespace ns
2026-03-25T15:42:49.219 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-25T15:42:49.254 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-25T15:42:49.254 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-25T15:42:59.262 INFO:tasks.workunit.client.0.vm04.stderr:+ test test5 = test5
2026-03-25T15:42:59.262 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd2 --namespace ns
2026-03-25T15:42:59.262 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-25T15:42:59.324 INFO:tasks.workunit.client.0.vm04.stderr:+ test test5 = test5
2026-03-25T15:42:59.324 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json
2026-03-25T15:42:59.327 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-25T15:42:59.361 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-25T15:42:59.362 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-25T15:43:11.945 INFO:tasks.ceph.osd.1.vm04.stderr:2026-03-25T15:43:11.944+0000 7fbe6a037640 -1 reset not still connected to 0x555c479f3450
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ test 'test1 test2 test3 test4 test5' = 'test1 test2 test3 test4 test5'
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 115939
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 115940
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 115941
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 115942
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 115943
2026-03-25T15:43:14.374 INFO:tasks.workunit.client.0.vm04.stderr:+ wait
2026-03-25T15:43:14.480 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:43:14.480 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:14.592 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:14.699 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:14.803 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:14.922 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.033 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.137 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.237 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.339 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.449 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.557 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.668 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.773 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:15.890 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:20.334 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:20.446 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:20.551 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:20.667 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:20.884 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-25T15:43:21.274 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-25T15:43:21.292 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it
2026-03-25T15:43:22.760 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' does not exist
2026-03-25T15:43:22.771 INFO:tasks.workunit.client.0.vm04.stderr:+ test_perf_image_iostat_recovery
2026-03-25T15:43:22.771 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing recovery of perf handler after module'\''s RADOS client is blocklisted...'
2026-03-25T15:43:22.771 INFO:tasks.workunit.client.0.vm04.stdout:testing recovery of perf handler after module's RADOS client is blocklisted...
2026-03-25T15:43:22.772 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:43:22.772 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:22.960 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.084 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.192 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.319 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.427 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.541 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.653 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.770 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:23.963 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.082 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.204 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.349 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.475 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.592 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.714 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.823 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:24.934 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:43:25.060 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd3 8
2026-03-25T15:43:25.764 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' already exists
2026-03-25T15:43:25.774 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd3
2026-03-25T15:43:28.732 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd namespace create rbd3/ns
2026-03-25T15:43:28.766 INFO:tasks.workunit.client.0.vm04.stderr:+ IMAGE_SPECS=("rbd3/test1" "rbd3/ns/test2")
2026-03-25T15:43:28.766 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-25T15:43:28.766 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/test1
2026-03-25T15:43:28.806 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-25T15:43:28.806 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/ns/test2
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS=()
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/test1
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ BENCH_PIDS+=($!)
2026-03-25T15:43:28.844 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/ns/test2
2026-03-25T15:43:28.845 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd3
2026-03-25T15:43:28.845 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-25T15:43:28.883 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-25T15:43:28.883 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-25T15:43:38.890 INFO:tasks.workunit.client.0.vm04.stderr:+ test test1 = test1
2026-03-25T15:43:38.890 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump
2026-03-25T15:43:38.911 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-25T15:43:38.911 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-25T15:43:38.911 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]'
2026-03-25T15:43:39.341 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/1093454813
2026-03-25T15:43:39.341 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/1093454813
2026-03-25T15:43:41.043 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/1093454813 until 2026-03-25T16:43:40.123543+0000 (3600 sec)
2026-03-25T15:43:41.056 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd perf image iostat --format json rbd3/ns
2026-03-25T15:43:41.056 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd perf image iostat --format json rbd3/ns
2026-03-25T15:43:41.093 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-25T15:43:41.094 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-25T15:43:46.094 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:43:46.092+0000 7fea51380640 -1 librbd::api::Image: list_images_v2: error listing image in directory: (108) Cannot send after transport endpoint shutdown
2026-03-25T15:43:46.094 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:43:46.092+0000 7fea51380640 -1 librbd::api::Image: list_images: error listing v2 images: (108) Cannot send after transport endpoint shutdown
2026-03-25T15:43:46.094 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:43:46.093+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-25T15:43:46.094 INFO:tasks.workunit.client.0.vm04.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-25T15:43:46.098 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:43:46.098 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10
2026-03-25T15:43:56.100 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24
2026-03-25T15:43:56.102 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24`
2026-03-25T15:43:56.105 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd perf image iostat --format json rbd3/ns
2026-03-25T15:43:56.105 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-25T15:43:56.151 INFO:tasks.workunit.client.0.vm04.stderr:rbd: waiting for initial image stats
2026-03-25T15:43:56.151 INFO:tasks.workunit.client.0.vm04.stderr:
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ test test2 = test2
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 117868
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ kill 117869
2026-03-25T15:44:06.158 INFO:tasks.workunit.client.0.vm04.stderr:+ wait
2026-03-25T15:44:06.209 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:44:06.210 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.312 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.415 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.514 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.618 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.720 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.821 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:06.918 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.220 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.326 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.439 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.537 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.640 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.742 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.847 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:07.964 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:08.071 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:08.171 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:08.267 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it
2026-03-25T15:44:09.173 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd3' does not exist
2026-03-25T15:44:09.187 INFO:tasks.workunit.client.0.vm04.stderr:+ test_mirror_pool_peer_bootstrap_create
2026-03-25T15:44:09.187 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing mirror pool peer bootstrap create...'
2026-03-25T15:44:09.187 INFO:tasks.workunit.client.0.vm04.stdout:testing mirror pool peer bootstrap create...
2026-03-25T15:44:09.187 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:44:09.187 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:09.363 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:09.518 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:09.619 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:09.717 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:09.818 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:09.934 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.035 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.139 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.246 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.346 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.449 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.548 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.650 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.747 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.840 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:10.938 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:11.047 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:11.149 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd1 8
2026-03-25T15:44:11.566 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' already exists
2026-03-25T15:44:11.577 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd1
2026-03-25T15:44:14.524 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd1 image
2026-03-25T15:44:14.559 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-25T15:44:15.590 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-25T15:44:15.600 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-25T15:44:18.539 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd mirror pool enable rbd2 pool
2026-03-25T15:44:18.579 INFO:tasks.workunit.client.0.vm04.stderr:+ readarray -t MON_ADDRS
2026-03-25T15:44:18.579 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mon dump
2026-03-25T15:44:18.579 INFO:tasks.workunit.client.0.vm04.stderr:++ sed -n 's/^[0-9]: \(.*\) mon\.[a-z]$/\1/p'
2026-03-25T15:44:18.937 INFO:tasks.workunit.client.0.vm04.stderr:dumped monmap epoch 1
2026-03-25T15:44:18.949 INFO:tasks.workunit.client.0.vm04.stderr:+ BAD_MON_ADDR=1.2.3.4:6789
2026-03-25T15:44:18.949 INFO:tasks.workunit.client.0.vm04.stderr:+ MON_HOST='[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789'
2026-03-25T15:44:18.950 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789' rbd1
2026-03-25T15:44:18.950 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-25T15:44:18.986 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN='{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-25T15:44:18.986 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .fsid
2026-03-25T15:44:18.988 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_FSID=08196b8a-fd91-49b2-b8a6-e1d21f829086
2026-03-25T15:44:18.989 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .client_id
2026-03-25T15:44:18.990 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_CLIENT_ID=rbd-mirror-peer
2026-03-25T15:44:18.991 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .key
2026-03-25T15:44:18.993 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_KEY=AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==
2026-03-25T15:44:18.993 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r .mon_host
2026-03-25T15:44:18.996 INFO:tasks.workunit.client.0.vm04.stderr:+ TOKEN_MON_HOST='[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]'
2026-03-25T15:44:18.996 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph fsid
2026-03-25T15:44:19.356 INFO:tasks.workunit.client.0.vm04.stderr:+ test 08196b8a-fd91-49b2-b8a6-e1d21f829086 = 08196b8a-fd91-49b2-b8a6-e1d21f829086
2026-03-25T15:44:19.356 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph auth get-key client.rbd-mirror-peer
2026-03-25T15:44:19.719 INFO:tasks.workunit.client.0.vm04.stderr:+ test AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ== = AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==
2026-03-25T15:44:19.720 INFO:tasks.workunit.client.0.vm04.stderr:+ for addr in "${MON_ADDRS[@]}"
2026-03-25T15:44:19.720 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]'
2026-03-25T15:44:19.721 INFO:tasks.workunit.client.0.vm04.stdout:[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]
2026-03-25T15:44:19.721 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail fgrep 1.2.3.4:6789
2026-03-25T15:44:19.722 INFO:tasks.workunit.client.0.vm04.stderr:+ fgrep 1.2.3.4:6789
2026-03-25T15:44:19.723 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:44:19.724 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789' rbd1
2026-03-25T15:44:19.724 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-25T15:44:19.757 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-25T15:44:19.758 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create rbd1
2026-03-25T15:44:19.758 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-25T15:44:19.793 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-25T15:44:19.793 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],1.2.3.4:6789' rbd2
2026-03-25T15:44:19.793 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-25T15:44:19.826 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-25T15:44:19.827 INFO:tasks.workunit.client.0.vm04.stderr:++ rbd mirror pool peer bootstrap create rbd2
2026-03-25T15:44:19.827 INFO:tasks.workunit.client.0.vm04.stderr:++ base64 -d
2026-03-25T15:44:19.860 INFO:tasks.workunit.client.0.vm04.stderr:+ test '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}' = '{"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","client_id":"rbd-mirror-peer","key":"AQDSAsRpklx3OhAAONzUk/lYfQK5uhoHnbPomQ==","mon_host":"[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]"}'
2026-03-25T15:44:19.860 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-25T15:44:20.611 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-25T15:44:20.623 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it
2026-03-25T15:44:21.607 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd1' does not exist
2026-03-25T15:44:21.621 INFO:tasks.workunit.client.0.vm04.stderr:+ test_tasks_removed_pool
2026-03-25T15:44:21.621 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing removing pool under running tasks...'
2026-03-25T15:44:21.621 INFO:tasks.workunit.client.0.vm04.stdout:testing removing pool under running tasks...
2026-03-25T15:44:21.621 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:44:21.621 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:21.727 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:21.831 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:21.936 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.042 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.146 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.246 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.357 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.460 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.677 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.840 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:22.954 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.057 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.155 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.256 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.366 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.470 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.567 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:23.665 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8
2026-03-25T15:44:23.979 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists
2026-03-25T15:44:23.990 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2
2026-03-25T15:44:26.930 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G foo
2026-03-25T15:44:26.977 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create foo@snap
2026-03-25T15:44:27.943 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:44:27.955 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect foo@snap
2026-03-25T15:44:28.177 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:28.177+0000 7f4ab98c5640 0 --2- 192.168.123.104:0/1609710191 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x56360f863f50 0x56360f900a20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-25T15:44:28.197 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone foo@snap bar
2026-03-25T15:44:28.269 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd2/dummy
2026-03-25T15:44:28.307 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/dummy
2026-03-25T15:44:28.343 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential
2026-03-25T15:44:29.378 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-25T15:44:29.378 INFO:tasks.workunit.client.0.vm04.stdout: 1 304 337.909 338 MiB/s
2026-03-25T15:44:30.231 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:30.230+0000 7fee27fff640 0 -- 192.168.123.104:0/167171463 >> [v2:192.168.123.104:6808/2830815940,v1:192.168.123.104:6809/2830815940] conn(0x7fee040343d0 msgr2=0x7fee04034840 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:44:30.388 INFO:tasks.workunit.client.0.vm04.stdout: 2 560 294.177 294 MiB/s
2026-03-25T15:44:31.358 INFO:tasks.workunit.client.0.vm04.stdout: 3 816 284.25 284 MiB/s
2026-03-25T15:44:31.468 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:31.468+0000 7fee2cd4d640 0 -- 192.168.123.104:0/167171463 >> [v2:192.168.123.104:6816/2768043570,v1:192.168.123.104:6817/2768043570] conn(0x7fee040343d0 msgr2=0x7fee0c0c1e20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:44:32.026 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:32.025+0000 7fee2dfd6640 0 -- 192.168.123.104:0/167171463 >> [v2:192.168.123.104:6800/2246359187,v1:192.168.123.104:6801/2246359187] conn(0x564352970370 msgr2=0x5643529fcb60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-25T15:44:32.271 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 3 ops: 1024 ops/sec: 260.692 bytes/sec: 261 MiB/s
2026-03-25T15:44:32.280 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd2/dummy@snap
2026-03-25T15:44:33.117 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-25T15:44:33.124 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd2/dummy@snap
2026-03-25T15:44:33.158 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:33.158 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy1
2026-03-25T15:44:33.207 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:33.207 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy2
2026-03-25T15:44:33.258 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:33.258 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy3
2026-03-25T15:44:33.308 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:33.308 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy4
2026-03-25T15:44:33.355 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:33.355 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy5
2026-03-25T15:44:33.402 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-25T15:44:33.731 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-25T15:44:33.731 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:33.731 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy1
2026-03-25T15:44:34.354 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 1, "id": "dca29f9d-0a92-4bdc-b4e9-a1ebe5d05157", "message": "Flattening image rbd2/dummy1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy1", "image_id": "31b8a16d9f0f"}, "in_progress": true, "progress": 0.05078125}
2026-03-25T15:44:34.378 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:34.379 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy2
2026-03-25T15:44:35.180 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 2, "id": "2d7122f0-e1a7-4df3-b709-296ffb6c0354", "message": "Flattening image rbd2/dummy2", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy2", "image_id": "31bbb3b6461e"}}
2026-03-25T15:44:35.196 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:35.196 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy3
2026-03-25T15:44:35.935 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 3, "id": "2fee88c5-6e9b-44b9-a701-b77ecf1e9e01", "message": "Flattening image rbd2/dummy3", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy3", "image_id": "31be72b2f727"}}
2026-03-25T15:44:35.953 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:35.953 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy4
2026-03-25T15:44:37.266 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 4, "id": "18d66043-8dbc-401f-852d-87ab45a1b1d5", "message": "Flattening image rbd2/dummy4", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy4", "image_id": "31c14e595a16"}}
2026-03-25T15:44:37.286 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..5}
2026-03-25T15:44:37.286 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/dummy5
2026-03-25T15:44:38.134 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 5, "id": "facc41f9-9bf3-4613-8610-04fabb1341da", "message": "Flattening image rbd2/dummy5", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy5", "image_id": "31c476906855"}}
2026-03-25T15:44:38.148 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool delete rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-25T15:44:38.753 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist
2026-03-25T15:44:38.765 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "id": "2d7122f0-e1a7-4df3-b709-296ffb6c0354",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy2",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "31bbb3b6461e",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy2",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 2
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "id": "2fee88c5-6e9b-44b9-a701-b77ecf1e9e01",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy3",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "31be72b2f727",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy3",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 3
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "id": "18d66043-8dbc-401f-852d-87ab45a1b1d5",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy4",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "31c14e595a16",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy4",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-25T15:44:39.092 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 4
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: {
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "id": "facc41f9-9bf3-4613-8610-04fabb1341da",
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/dummy5",
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "refs": {
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten",
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "31c476906855",
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "dummy5",
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2",
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": ""
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: },
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 5
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr: }
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr:]' '!=' '[]'
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info bar
2026-03-25T15:44:39.093 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-25T15:44:39.140 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd/foo@snap
2026-03-25T15:44:39.140 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail rbd snap unprotect foo@snap
2026-03-25T15:44:39.140 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect foo@snap
2026-03-25T15:44:39.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:39.178+0000 7f67461a9640 -1 librbd::SnapshotUnprotectRequest: cannot unprotect: at least 1 child(ren) [31a9f08f608c] in pool 'rbd'
2026-03-25T15:44:39.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:39.178+0000 7f67461a9640 -1 librbd::SnapshotUnprotectRequest: encountered error: (16) Device or resource busy
2026-03-25T15:44:39.178 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:39.178+0000 7f67461a9640 -1 librbd::SnapshotUnprotectRequest: 0x56254afae390 should_complete_error: ret_val=-16
2026-03-25T15:44:39.180 INFO:tasks.workunit.client.0.vm04.stderr:2026-03-25T15:44:39.180+0000 7f67469aa640 -1 librbd::SnapshotUnprotectRequest: 0x56254afae390 should_complete_error: ret_val=-16
2026-03-25T15:44:39.180 INFO:tasks.workunit.client.0.vm04.stderr:rbd: unprotecting snap failed: (16) Device or resource busy
2026-03-25T15:44:39.186 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:44:39.187 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten bar
2026-03-25T15:44:39.538 INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 6, "id": "ce89b55c-3a63-418d-9c24-080ba5c6cc6f", "message": "Flattening image rbd/bar", "refs": {"action": "flatten", "pool_name": "rbd", "pool_namespace": "", "image_name": "bar", "image_id": "31a9f08f608c"}, "retry_attempts": 1, "retry_time": "2026-03-25T15:45:09.483668"}
2026-03-25T15:44:39.550 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12}
2026-03-25T15:44:39.550 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info bar
2026-03-25T15:44:39.550 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-25T15:44:39.581 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:44:39.581 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info bar
2026-03-25T15:44:39.581 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'parent: '
2026-03-25T15:44:39.581 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: '
2026-03-25T15:44:39.612 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0
2026-03-25T15:44:39.612 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect foo@snap
2026-03-25T15:44:39.660 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12}
2026-03-25T15:44:39.660 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-25T15:44:39.977 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-25T15:44:39.977 INFO:tasks.workunit.client.0.vm04.stderr:+ break
2026-03-25T15:44:39.977 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list
2026-03-25T15:44:40.304 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]'
2026-03-25T15:44:40.304 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images
2026-03-25T15:44:40.304 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:40.399 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:40.495 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:40.872 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:40.977 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:41.080 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:41.177 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:41.274 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:41.375 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:41.477 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.193 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.286 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.452 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.547 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.647 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.745 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:42.845 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:43.034 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS
2026-03-25T15:44:43.189 INFO:tasks.workunit.client.0.vm04.stderr:+ test_tasks_recovery 2026-03-25T15:44:43.189 INFO:tasks.workunit.client.0.vm04.stderr:+ echo 'testing task handler recovery after module'\''s RADOS client is blocklisted...' 2026-03-25T15:44:43.189 INFO:tasks.workunit.client.0.vm04.stdout:testing task handler recovery after module's RADOS client is blocklisted... 2026-03-25T15:44:43.189 INFO:tasks.workunit.client.0.vm04.stderr:+ remove_images 2026-03-25T15:44:43.189 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.290 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.390 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.489 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.589 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.688 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.786 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.883 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:43.981 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.085 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.184 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.283 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.379 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.480 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.608 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.712 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.812 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:44.914 INFO:tasks.workunit.client.0.vm04.stderr:+ for img in $IMGS 2026-03-25T15:44:45.016 
INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool create rbd2 8 2026-03-25T15:44:45.400 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' already exists 2026-03-25T15:44:45.410 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd pool init rbd2 2026-03-25T15:44:48.359 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd create --image-format 2 --size 1G rbd2/img1 2026-03-25T15:44:48.401 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/img1 2026-03-25T15:44:48.437 INFO:tasks.workunit.client.0.vm04.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential 2026-03-25T15:44:49.466 INFO:tasks.workunit.client.0.vm04.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-25T15:44:49.466 INFO:tasks.workunit.client.0.vm04.stdout: 1 240 251.968 252 MiB/s 2026-03-25T15:44:50.685 INFO:tasks.workunit.client.0.vm04.stdout: 2 496 229.082 229 MiB/s 2026-03-25T15:44:51.465 INFO:tasks.workunit.client.0.vm04.stdout: 3 720 244.112 244 MiB/s 2026-03-25T15:44:52.513 INFO:tasks.workunit.client.0.vm04.stdout: 4 1008 252.03 252 MiB/s 2026-03-25T15:44:52.817 INFO:tasks.workunit.client.0.vm04.stdout:elapsed: 4 ops: 1024 ops/sec: 233.79 bytes/sec: 234 MiB/s 2026-03-25T15:44:52.825 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap create rbd2/img1@snap 2026-03-25T15:44:53.094 INFO:tasks.workunit.client.0.vm04.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-25T15:44:53.101 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap protect rbd2/img1@snap 2026-03-25T15:44:53.137 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd clone rbd2/img1@snap rbd2/clone1 2026-03-25T15:44:53.188 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph mgr dump 2026-03-25T15:44:53.188 INFO:tasks.workunit.client.0.vm04.stderr:++ jq '.active_clients[]' 2026-03-25T15:44:53.188 INFO:tasks.workunit.client.0.vm04.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-25T15:44:53.188 INFO:tasks.workunit.client.0.vm04.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-25T15:44:53.501 INFO:tasks.workunit.client.0.vm04.stderr:+ CLIENT_ADDR=192.168.123.104:0/2599108749 2026-03-25T15:44:53.501 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd blocklist add 192.168.123.104:0/2599108749 2026-03-25T15:44:55.288 INFO:tasks.workunit.client.0.vm04.stderr:blocklisting 192.168.123.104:0/2599108749 until 2026-03-25T16:44:54.296557+0000 (3600 sec) 2026-03-25T15:44:55.303 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail ceph rbd task add flatten rbd2/clone1 2026-03-25T15:44:55.303 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-25T15:44:55.512 INFO:tasks.ceph.mgr.x.vm04.stderr:2026-03-25T15:44:55.512+0000 7fea55b89640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-25T15:44:55.512 INFO:tasks.workunit.client.0.vm04.stderr:Error EAGAIN: rbd_support module is not ready, try again 2026-03-25T15:44:55.518 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:44:55.518 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:45:05.520 INFO:tasks.workunit.client.0.vm04.stderr:++ seq 24 2026-03-25T15:45:05.521 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in `seq 24` 2026-03-25T15:45:05.521 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-25T15:45:06.440 
INFO:tasks.workunit.client.0.vm04.stdout:{"sequence": 1, "id": "5f58d6e1-4a1c-4b73-826b-6d2650801aef", "message": "Flattening image rbd2/clone1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "clone1", "image_id": "32d773ad4b0"}, "in_progress": true, "progress": 0.03515625} 2026-03-25T15:45:06.461 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:45:06.461 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[ 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: { 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "id": "5f58d6e1-4a1c-4b73-826b-6d2650801aef", 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "in_progress": true, 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "message": "Flattening image rbd2/clone1", 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "progress": 0.19140625, 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "refs": { 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "action": "flatten", 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "image_id": "32d773ad4b0", 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "image_name": "clone1", 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "pool_name": "rbd2", 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "pool_namespace": "" 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: }, 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: "sequence": 1 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr: } 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr:]' '!=' '[]' 2026-03-25T15:45:06.825 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12} 2026-03-25T15:45:06.826 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info 
rbd2/clone1 2026-03-25T15:45:06.826 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: ' 2026-03-25T15:45:06.892 INFO:tasks.workunit.client.0.vm04.stdout: parent: rbd2/img1@snap 2026-03-25T15:45:06.892 INFO:tasks.workunit.client.0.vm04.stderr:+ sleep 10 2026-03-25T15:45:16.894 INFO:tasks.workunit.client.0.vm04.stderr:+ for i in {1..12} 2026-03-25T15:45:16.894 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info rbd2/clone1 2026-03-25T15:45:16.894 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: ' 2026-03-25T15:45:16.929 INFO:tasks.workunit.client.0.vm04.stderr:+ break 2026-03-25T15:45:16.929 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd info rbd2/clone1 2026-03-25T15:45:16.929 INFO:tasks.workunit.client.0.vm04.stderr:+ expect_fail grep 'parent: ' 2026-03-25T15:45:16.929 INFO:tasks.workunit.client.0.vm04.stderr:+ grep 'parent: ' 2026-03-25T15:45:16.966 INFO:tasks.workunit.client.0.vm04.stderr:+ return 0 2026-03-25T15:45:16.966 INFO:tasks.workunit.client.0.vm04.stderr:+ rbd snap unprotect rbd2/img1@snap 2026-03-25T15:45:17.007 INFO:tasks.workunit.client.0.vm04.stderr:++ ceph rbd task list 2026-03-25T15:45:17.358 INFO:tasks.workunit.client.0.vm04.stderr:+ test '[]' = '[]' 2026-03-25T15:45:17.358 INFO:tasks.workunit.client.0.vm04.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-25T15:45:18.002 INFO:tasks.workunit.client.0.vm04.stderr:pool 'rbd2' does not exist 2026-03-25T15:45:18.024 INFO:tasks.workunit.client.0.vm04.stdout:OK 2026-03-25T15:45:18.024 INFO:tasks.workunit.client.0.vm04.stderr:+ echo OK 2026-03-25T15:45:18.025 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-25T15:45:18.025 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-25T15:45:18.110 INFO:tasks.workunit:Stopping ['rbd/cli_generic.sh'] on client.0... 
2026-03-25T15:45:18.110 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-25T15:45:18.497 DEBUG:teuthology.parallel:result is None 2026-03-25T15:45:18.497 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-25T15:45:18.522 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-25T15:45:18.522 DEBUG:teuthology.orchestra.run.vm04:> rmdir -- /home/ubuntu/cephtest/mnt.0 2026-03-25T15:45:18.578 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0 2026-03-25T15:45:18.579 DEBUG:teuthology.run_tasks:Unwinding manager ceph 2026-03-25T15:45:18.581 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-25T15:45:18.581 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:45:18.820 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:18.821 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-25T15:45:18.834 
INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":770,"stamp":"2026-03-25T15:45:18.010266+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590395,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":83823,"num_read_kb":233753,"num_write":42658,"num_write_kb":8894265,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46451,"ondisk_log_size":46451,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6393752,"kb_used_data":4201768,"kb_used_omap":1254,"kb_used_meta":2190681,"kb_avail":276721768,"statfs":{"total":289910292480,"available":283363090432,"internally_reserved":0,"allocated":4302610432,"data_stored":4299055816,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1284242,"internal_metadata":2243258222},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram"
:[],"upper_bound":1},"perf_stat":{"commit_latency_ms":75,"apply_latency_ms":75,"commit_latency_ns":75000000,"apply_latency_ns":75000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1073742168,"num_objects":-267,"num_object_clones":0,"num_object_copies":-534,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-267,"num_whiteouts":0,"num_read":-1766,"num_read_kb":-1466,"num_write":-1561,"num_write_kb":-1049105,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-4,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001497"},"pg_stats":[{"pgid":"2.7","version":"240'4706","reported_seq":9200,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650352+0000","last_change":"2026-03-25T15:44:42.032154+0000","last_active":"2026-03-25T15:44:57.650352+0000","last_peered":"2026-03-25T15:44:57.650352+0000","last_clean":"2026-03-25T15:44:57.650352+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:44:57.650352+0000","last_undegraded":"2026-03-25T15:44:57.650352+0000","last_full
sized":"2026-03-25T15:44:57.650352+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4706,"log_dups_size":0,"ondisk_log_size":4706,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:35:08.519293+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0012947939999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8638,"num_read_kb":26386,"num_write":4191,"num_write_kb":1223163,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'5693","reported_seq":10015,"reported_epoch":249,"state":"a
ctive+clean","last_fresh":"2026-03-25T15:44:57.650404+0000","last_change":"2026-03-25T15:44:42.038725+0000","last_active":"2026-03-25T15:44:57.650404+0000","last_peered":"2026-03-25T15:44:57.650404+0000","last_clean":"2026-03-25T15:44:57.650404+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:44:57.650404+0000","last_undegraded":"2026-03-25T15:44:57.650404+0000","last_fullsized":"2026-03-25T15:44:57.650404+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":5693,"log_dups_size":0,"ondisk_log_size":5693,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T16:54:27.969653+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0078476510000000006,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8601,"num_read_kb":25744,"num_write":4420,"num_write_kb":1147587,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'4531","reported_seq":8077,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650399+0000","last_change":"2026-03-25T15:44:42.032413+0000","last_active":"2026-03-25T15:44:57.650399+0000","last_peered":"2026-03-25T15:44:57.650399+0000","last_clean":"2026-03-25T15:44:57.650399+0000","last_became_active":"2026-03-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:44:57.650399+0000","last_undegraded":"2026-03-25T15:44:57.650399+0000","last_fullsized":"2026-03-25T15:44:57.650399+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_dee
p_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4531,"log_dups_size":0,"ondisk_log_size":4531,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:13:48.703930+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0013904729999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5956,"num_read_kb":29993,"num_write":4026,"num_write_kb":1129299,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'7174","reported_seq":12491,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650458+0000","last_change":"2026-03-25T15:44:42.032249+0000","last_active":"2026-03-25T15:44:57.650458+0000","last_peered":"2026-03-25T15:44:57.650458+0000","last_clean":"2026-03-25T15:44:57.650458+0000","last_became_active":"2026-03
-25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:44:57.650458+0000","last_undegraded":"2026-03-25T15:44:57.650458+0000","last_fullsized":"2026-03-25T15:44:57.650458+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7174,"log_dups_size":0,"ondisk_log_size":7174,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T23:08:43.358486+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011625450000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":11406,"num_read_kb":35122,"num_write":5800,"num_write_kb":1069558,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_mis
sing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"240'4591","reported_seq":9990,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.493286+0000","last_change":"2026-03-25T15:44:42.029197+0000","last_active":"2026-03-25T15:44:57.493286+0000","last_peered":"2026-03-25T15:44:57.493286+0000","last_clean":"2026-03-25T15:44:57.493286+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:44:57.493286+0000","last_undegraded":"2026-03-25T15:44:57.493286+0000","last_fullsized":"2026-03-25T15:44:57.493286+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4591,"log_dups_size":0,"ondisk_log_size":4591,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T18:06:56.373227+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028202799999999998,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7637,"num_read_kb":27432,"num_write":4016,"num_write_kb":1039592,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"240'4352","reported_seq":7745,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:45:01.165809+0000","last_change":"2026-03-25T15:44:42.029939+0000","last_active":"2026-03-25T15:45:01.165809+0000","last_peered":"2026-03-25T15:45:01.165809+0000","last_clean":"2026-03-25T15:45:01.165809+0000","last_became_active":"2026-03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:45:01.165809+0000","last_undegraded":"2026-03-25T15:45:01.165809+0000","last_fullsized":"2026-03-25T15:45:01.165809+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4352,"log_dups_size":0,"ondisk_log_size":4352,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:31:48.660269+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026995600000000002,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":5646,"num_read_kb":20721,"num_write":3954,"num_write_kb":1133257,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"240'7763","reported_seq":12182,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.054905+0000","last_change":"2026-03-25T15:44:42.030197+0000","last_active":"2026-03-25T15:44:57.054905+0000","last_peered":"2026-03-25T15:44:57.054905+0000","last_clean":"2026-03-25T15:44:57.054905+0000","last_became_active":"2026
-03-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:44:57.054905+0000","last_undegraded":"2026-03-25T15:44:57.054905+0000","last_fullsized":"2026-03-25T15:44:57.054905+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7763,"log_dups_size":0,"ondisk_log_size":7763,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:31:40.142433+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00052921100000000003,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":19969,"num_read_kb":34601,"num_write":8465,"num_write_kb":1155971,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"240'7549","reported_seq":13309,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650535+0000","last_change":"2026-03-25T15:44:42.032347+0000","last_active":"2026-03-25T15:44:57.650535+0000","last_peered":"2026-03-25T15:44:57.650535+0000","last_clean":"2026-03-25T15:44:57.650535+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:44:57.650535+0000","last_undegraded":"2026-03-25T15:44:57.650535+0000","last_fullsized":"2026-03-25T15:44:57.650535+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7549,"log_dups_size":0,"ondisk_log_size":7549,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-27T01:08:41.251655+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011680359999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":15894,"num_read_kb":33690,"num_write":7663,"num_write_kb":993688,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"11'92","reported_seq":607,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650527+0000","last_change":"2026-03-25T15:27:54.967550+0000","last_active":"2026-03-25T15:44:57.650527+0000","last_peered":"2026-03-25T15:44:57.650527+0000","last_clean":"2026-03-25T15:44:57.650527+0000","last_became_active":"2026-03-25T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:44:57.650527+0000","last_undegraded":"2026-03-25T15:44:57.650527+0000","last_fullsized":"2026-03-25T15:44:57.650527+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_deep_scrub":"0'0","last_deep_sc
rub_stamp":"2026-03-25T15:27:53.958171+0000","last_clean_scrub_stamp":"2026-03-25T15:27:53.958171+0000","objects_scrubbed":0,"log_size":92,"log_dups_size":0,"ondisk_log_size":92,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:15:15.083319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":83747,"num_read_kb":233689,"num_write":42535,"num_write_kb":8892115,"num_scrub_errors":0
,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1009749,"internal_metadata":0},"log_size":46359,"ondisk_log_size":46359,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0
},"log_size":92,"ondisk_log_size":92,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705874,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1717256,"kb_used_data":932472,"kb_used_omap":491,"kb_used_meta":784276,"kb_avail":92654584,"statfs":{"total":96636764160,"available":94878294016,"internally_reserved":0,"allocated":954851328,"data_stored":953666550,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":503627,"internal_metadata":803098805},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":31,"apply_latency_ms":31,"commit_latency_ns":31000000,"apply_latency_ns":31000000},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705875,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2812436,"kb_used_data":2099844,"kb_used_omap":405,"kb_used_meta":712170,"kb_avail":91559404,"statfs":{"total":96636764160,"available":93756829696,"internally_reserved":0,"allocated":2150240256,"data_stored":2149043401,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":415279,"internal_metadata":729262545},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":44,"apply_latency_ms":44,"commit_latency_ns":44000000,"apply_latency_ns":44000000},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738579,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1864060,"kb_used_data":1169452,"kb_used_omap":356,"kb_used_meta":694235,"kb_avail":92507780,"statfs":{"total":96636764160,"available":94727966720,"internally_reserved":0,"allocated":1197518848,"data_stored":1196345865,"data_compressed":0,"data_compressed_all
ocated":0,"data_compressed_original":0,"omap_allocated":365336,"internal_metadata":710896872},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":229441,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":416568,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":363740,"internal_metadata":0}]}} 2026-03-25T15:45:18.834 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:45:19.087 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:19.087 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-25T15:45:19.099 
INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":770,"stamp":"2026-03-25T15:45:18.010266+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590395,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":83823,"num_read_kb":233753,"num_write":42658,"num_write_kb":8894265,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46451,"ondisk_log_size":46451,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6393752,"kb_used_data":4201768,"kb_used_omap":1254,"kb_used_meta":2190681,"kb_avail":276721768,"statfs":{"total":289910292480,"available":283363090432,"internally_reserved":0,"allocated":4302610432,"data_stored":4299055816,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1284242,"internal_metadata":2243258222},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram"
:[],"upper_bound":1},"perf_stat":{"commit_latency_ms":75,"apply_latency_ms":75,"commit_latency_ns":75000000,"apply_latency_ns":75000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1073742168,"num_objects":-267,"num_object_clones":0,"num_object_copies":-534,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-267,"num_whiteouts":0,"num_read":-1766,"num_read_kb":-1466,"num_write":-1561,"num_write_kb":-1049105,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-4,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001497"},"pg_stats":[{"pgid":"2.7","version":"240'4706","reported_seq":9200,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650352+0000","last_change":"2026-03-25T15:44:42.032154+0000","last_active":"2026-03-25T15:44:57.650352+0000","last_peered":"2026-03-25T15:44:57.650352+0000","last_clean":"2026-03-25T15:44:57.650352+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:44:57.650352+0000","last_undegraded":"2026-03-25T15:44:57.650352+0000","last_full
sized":"2026-03-25T15:44:57.650352+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4706,"log_dups_size":0,"ondisk_log_size":4706,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:35:08.519293+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0012947939999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8638,"num_read_kb":26386,"num_write":4191,"num_write_kb":1223163,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'5693","reported_seq":10015,"reported_epoch":249,"state":"a
ctive+clean","last_fresh":"2026-03-25T15:44:57.650404+0000","last_change":"2026-03-25T15:44:42.038725+0000","last_active":"2026-03-25T15:44:57.650404+0000","last_peered":"2026-03-25T15:44:57.650404+0000","last_clean":"2026-03-25T15:44:57.650404+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:44:57.650404+0000","last_undegraded":"2026-03-25T15:44:57.650404+0000","last_fullsized":"2026-03-25T15:44:57.650404+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":5693,"log_dups_size":0,"ondisk_log_size":5693,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T16:54:27.969653+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0078476510000000006,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8601,"num_read_kb":25744,"num_write":4420,"num_write_kb":1147587,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'4531","reported_seq":8077,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650399+0000","last_change":"2026-03-25T15:44:42.032413+0000","last_active":"2026-03-25T15:44:57.650399+0000","last_peered":"2026-03-25T15:44:57.650399+0000","last_clean":"2026-03-25T15:44:57.650399+0000","last_became_active":"2026-03-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:44:57.650399+0000","last_undegraded":"2026-03-25T15:44:57.650399+0000","last_fullsized":"2026-03-25T15:44:57.650399+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_dee
p_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4531,"log_dups_size":0,"ondisk_log_size":4531,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:13:48.703930+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0013904729999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5956,"num_read_kb":29993,"num_write":4026,"num_write_kb":1129299,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'7174","reported_seq":12491,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650458+0000","last_change":"2026-03-25T15:44:42.032249+0000","last_active":"2026-03-25T15:44:57.650458+0000","last_peered":"2026-03-25T15:44:57.650458+0000","last_clean":"2026-03-25T15:44:57.650458+0000","last_became_active":"2026-03
-25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:44:57.650458+0000","last_undegraded":"2026-03-25T15:44:57.650458+0000","last_fullsized":"2026-03-25T15:44:57.650458+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7174,"log_dups_size":0,"ondisk_log_size":7174,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T23:08:43.358486+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011625450000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":11406,"num_read_kb":35122,"num_write":5800,"num_write_kb":1069558,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_mis
sing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"240'4591","reported_seq":9990,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.493286+0000","last_change":"2026-03-25T15:44:42.029197+0000","last_active":"2026-03-25T15:44:57.493286+0000","last_peered":"2026-03-25T15:44:57.493286+0000","last_clean":"2026-03-25T15:44:57.493286+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:44:57.493286+0000","last_undegraded":"2026-03-25T15:44:57.493286+0000","last_fullsized":"2026-03-25T15:44:57.493286+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4591,"log_dups_size":0,"ondisk_log_size":4591,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T18:06:56.373227+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028202799999999998,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7637,"num_read_kb":27432,"num_write":4016,"num_write_kb":1039592,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"240'4352","reported_seq":7745,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:45:01.165809+0000","last_change":"2026-03-25T15:44:42.029939+0000","last_active":"2026-03-25T15:45:01.165809+0000","last_peered":"2026-03-25T15:45:01.165809+0000","last_clean":"2026-03-25T15:45:01.165809+0000","last_became_active":"2026-03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:45:01.165809+0000","last_undegraded":"2026-03-25T15:45:01.165809+0000","last_fullsized":"2026-03-25T15:45:01.165809+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4352,"log_dups_size":0,"ondisk_log_size":4352,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:31:48.660269+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026995600000000002,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":5646,"num_read_kb":20721,"num_write":3954,"num_write_kb":1133257,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"240'7763","reported_seq":12182,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.054905+0000","last_change":"2026-03-25T15:44:42.030197+0000","last_active":"2026-03-25T15:44:57.054905+0000","last_peered":"2026-03-25T15:44:57.054905+0000","last_clean":"2026-03-25T15:44:57.054905+0000","last_became_active":"2026
-03-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:44:57.054905+0000","last_undegraded":"2026-03-25T15:44:57.054905+0000","last_fullsized":"2026-03-25T15:44:57.054905+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7763,"log_dups_size":0,"ondisk_log_size":7763,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:31:40.142433+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00052921100000000003,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":19969,"num_read_kb":34601,"num_write":8465,"num_write_kb":1155971,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"240'7549","reported_seq":13309,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650535+0000","last_change":"2026-03-25T15:44:42.032347+0000","last_active":"2026-03-25T15:44:57.650535+0000","last_peered":"2026-03-25T15:44:57.650535+0000","last_clean":"2026-03-25T15:44:57.650535+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:44:57.650535+0000","last_undegraded":"2026-03-25T15:44:57.650535+0000","last_fullsized":"2026-03-25T15:44:57.650535+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7549,"log_dups_size":0,"ondisk_log_size":7549,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-27T01:08:41.251655+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011680359999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":15894,"num_read_kb":33690,"num_write":7663,"num_write_kb":993688,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"11'92","reported_seq":607,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650527+0000","last_change":"2026-03-25T15:27:54.967550+0000","last_active":"2026-03-25T15:44:57.650527+0000","last_peered":"2026-03-25T15:44:57.650527+0000","last_clean":"2026-03-25T15:44:57.650527+0000","last_became_active":"2026-03-25T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:44:57.650527+0000","last_undegraded":"2026-03-25T15:44:57.650527+0000","last_fullsized":"2026-03-25T15:44:57.650527+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_deep_scrub":"0'0","last_deep_sc
rub_stamp":"2026-03-25T15:27:53.958171+0000","last_clean_scrub_stamp":"2026-03-25T15:27:53.958171+0000","objects_scrubbed":0,"log_size":92,"log_dups_size":0,"ondisk_log_size":92,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:15:15.083319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":83747,"num_read_kb":233689,"num_write":42535,"num_write_kb":8892115,"num_scrub_errors":0
,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1009749,"internal_metadata":0},"log_size":46359,"ondisk_log_size":46359,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0
},"log_size":92,"ondisk_log_size":92,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705874,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1717256,"kb_used_data":932472,"kb_used_omap":491,"kb_used_meta":784276,"kb_avail":92654584,"statfs":{"total":96636764160,"available":94878294016,"internally_reserved":0,"allocated":954851328,"data_stored":953666550,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":503627,"internal_metadata":803098805},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":31,"apply_latency_ms":31,"commit_latency_ns":31000000,"apply_latency_ns":31000000},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705875,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2812436,"kb_used_data":2099844,"kb_used_omap":405,"kb_used_meta":712170,"kb_avail":91559404,"statfs":{"total":96636764160,"available":93756829696,"internally_reserved":0,"allocated":2150240256,"data_stored":2149043401,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":415279,"internal_metadata":729262545},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":44,"apply_latency_ms":44,"commit_latency_ns":44000000,"apply_latency_ns":44000000},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738579,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1864060,"kb_used_data":1169452,"kb_used_omap":356,"kb_used_meta":694235,"kb_avail":92507780,"statfs":{"total":96636764160,"available":94727966720,"internally_reserved":0,"allocated":1197518848,"data_stored":1196345865,"data_compressed":0,"data_compressed_all
ocated":0,"data_compressed_original":0,"omap_allocated":365336,"internal_metadata":710896872},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":229441,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":416568,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":363740,"internal_metadata":0}]}} 2026-03-25T15:45:19.100 INFO:tasks.ceph.ceph_manager.ceph:clean! 
2026-03-25T15:45:19.100 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:45:19.299 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:19.300 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-25T15:45:19.311 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":770,"stamp":"2026-03-25T15:45:18.010266+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590395,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":83823,"num_read_kb":233753,"num_write":42658,"num_write_kb":8894265,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46451,"ondisk_log_size":46451,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":6393752,"kb_used_data":4201768,"kb_used_omap":1254,"kb_used_meta":2190681,"kb_avail":276721768,"statfs":{"total":28991
0292480,"available":283363090432,"internally_reserved":0,"allocated":4302610432,"data_stored":4299055816,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1284242,"internal_metadata":2243258222},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":75,"apply_latency_ms":75,"commit_latency_ns":75000000,"apply_latency_ns":75000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1073742168,"num_objects":-267,"num_object_clones":0,"num_object_copies":-534,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-267,"num_whiteouts":0,"num_read":-1766,"num_read_kb":-1466,"num_write":-1561,"num_write_kb":-1049105,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-4,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001497"},"pg_stats":[{"pgid":"2.7","version":"240'4706","reported_seq":9200,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650352+0000","last_change":"2026-03-25T15:44:42.032154+0000","last_a
ctive":"2026-03-25T15:44:57.650352+0000","last_peered":"2026-03-25T15:44:57.650352+0000","last_clean":"2026-03-25T15:44:57.650352+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:44:57.650352+0000","last_undegraded":"2026-03-25T15:44:57.650352+0000","last_fullsized":"2026-03-25T15:44:57.650352+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4706,"log_dups_size":0,"ondisk_log_size":4706,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T16:35:08.519293+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0012947939999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8638,"num_read_kb":26386,"num_write":4191,"num_write_kb":1223163,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'5693","reported_seq":10015,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650404+0000","last_change":"2026-03-25T15:44:42.038725+0000","last_active":"2026-03-25T15:44:57.650404+0000","last_peered":"2026-03-25T15:44:57.650404+0000","last_clean":"2026-03-25T15:44:57.650404+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:44:57.650404+0000","last_undegraded":"2026-03-25T15:44:57.650404+0000","last_fullsized":"2026-03-25T15:44:57.650404+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":5693,"log_dups_size":0,"ondisk_log_size":5693,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:54:27.969653+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0078476510000000006,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8601,"num_read_kb":25744,"num_write":4420,"num_write_kb":1147587,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'4531","reported_seq":8077,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650399+0000","last_change":"2026-03-25T15:44:42.032413+0000","last_active":"2026-03-25T15:44:57.650399+0000","last_peered":"2026-03-25T15:44:57.650399+0000","last_clean":"2026-03-25T15:44:57.650399+0000","last_became_active":"2026-03
-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:44:57.650399+0000","last_undegraded":"2026-03-25T15:44:57.650399+0000","last_fullsized":"2026-03-25T15:44:57.650399+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4531,"log_dups_size":0,"ondisk_log_size":4531,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:13:48.703930+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0013904729999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5956,"num_read_kb":29993,"num_write":4026,"num_write_kb":1129299,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_miss
ing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'7174","reported_seq":12491,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650458+0000","last_change":"2026-03-25T15:44:42.032249+0000","last_active":"2026-03-25T15:44:57.650458+0000","last_peered":"2026-03-25T15:44:57.650458+0000","last_clean":"2026-03-25T15:44:57.650458+0000","last_became_active":"2026-03-25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:44:57.650458+0000","last_undegraded":"2026-03-25T15:44:57.650458+0000","last_fullsized":"2026-03-25T15:44:57.650458+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7174,"log_dups_size":0,"ondisk_log_size":7174,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T23:08:43.358486+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011625450000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":11406,"num_read_kb":35122,"num_write":5800,"num_write_kb":1069558,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"240'4591","reported_seq":9990,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.493286+0000","last_change":"2026-03-25T15:44:42.029197+0000","last_active":"2026-03-25T15:44:57.493286+0000","last_peered":"2026-03-25T15:44:57.493286+0000","last_clean":"2026-03-25T15:44:57.493286+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:44:57.493286+0000","last_undegraded":"2026-03-25T15:44:57.493286+0000","last_fullsized":"2026-03-25T15:44:57.493286+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4591,"log_dups_size":0,"ondisk_log_size":4591,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T18:06:56.373227+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028202799999999998,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7637,"num_read_kb":27432,"num_write":4016,"num_write_kb":1039592,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"240'4352","reported_seq":7745,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:45:01.165809+0000","last_change":"2026-03-25T15:44:42.029939+0000","last_active":"2026-03-25T15:45:01.165809+0000","last_peered":"2026-03-25T15:45:01.165809+0000","last_clean":"2026-03-25T15:45:01.165809+0000","last_became_active":"2026-
03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:45:01.165809+0000","last_undegraded":"2026-03-25T15:45:01.165809+0000","last_fullsized":"2026-03-25T15:45:01.165809+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4352,"log_dups_size":0,"ondisk_log_size":4352,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:31:48.660269+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026995600000000002,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":5646,"num_read_kb":20721,"num_write":3954,"num_write_kb":1133257,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_m
issing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"240'7763","reported_seq":12182,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.054905+0000","last_change":"2026-03-25T15:44:42.030197+0000","last_active":"2026-03-25T15:44:57.054905+0000","last_peered":"2026-03-25T15:44:57.054905+0000","last_clean":"2026-03-25T15:44:57.054905+0000","last_became_active":"2026-03-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:44:57.054905+0000","last_undegraded":"2026-03-25T15:44:57.054905+0000","last_fullsized":"2026-03-25T15:44:57.054905+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7763,"log_dups_size":0,"ondisk_log_size":7763,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T21:31:40.142433+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00052921100000000003,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":19969,"num_read_kb":34601,"num_write":8465,"num_write_kb":1155971,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"240'7549","reported_seq":13309,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650535+0000","last_change":"2026-03-25T15:44:42.032347+0000","last_active":"2026-03-25T15:44:57.650535+0000","last_peered":"2026-03-25T15:44:57.650535+0000","last_clean":"2026-03-25T15:44:57.650535+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:44:57.650535+0000","last_undegraded":"2026-03-25T15:44:57.650535+0000","last_fullsized":"2026-03-25T15:44:57.650535+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7549,"log_dups_size":0,"ondisk_log_size":7549,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T01:08:41.251655+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011680359999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":15894,"num_read_kb":33690,"num_write":7663,"num_write_kb":993688,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"11'92","reported_seq":607,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.650527+0000","last_change":"2026-03-25T15:27:54.967550+0000","last_active":"2026-03-25T15:44:57.650527+0000","last_peered":"2026-03-25T15:44:57.650527+0000","last_clean":"2026-03-25T15:44:57.650527+0000","last_became_active":"2026-03-2
5T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:44:57.650527+0000","last_undegraded":"2026-03-25T15:44:57.650527+0000","last_fullsized":"2026-03-25T15:44:57.650527+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_clean_scrub_stamp":"2026-03-25T15:27:53.958171+0000","objects_scrubbed":0,"log_size":92,"log_dups_size":0,"ondisk_log_size":92,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:15:15.083319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_count
s":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":83747,"num_read_kb":233689,"num_write":42535,"num_write_kb":8892115,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1009749,"internal_metadata":0},"log_size":46359,"ondisk_log_size":46359,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_fl
ush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":92,"ondisk_log_size":92,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705874,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1717256,"kb_used_data":932472,"kb_used_omap":491,"kb_used_meta":784276,"kb_avail":92654584,"statfs":{"total":96636764160,"available":94878294016,"internally_reserved":0,"allocated":954851328,"data_stored":953666550,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":503627,"internal_metadata":803098805},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":31,"apply_latency_ms":31,"commit_latency_ns":31000000,"apply_latency_ns":31000000},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705875,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2812436,"kb_used_data":2099844,"kb_used_omap":405,"kb_used_meta":712170,"kb_avail":91559404,"statfs":{"total":96636764160,"available":93756829696,"internally_reserved":0,"allocated":2150240256,"data_stored":2149043401,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":415279,"internal_metadata":729262545},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":44,"apply_latency
_ms":44,"commit_latency_ns":44000000,"apply_latency_ns":44000000},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738579,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1864060,"kb_used_data":1169452,"kb_used_omap":356,"kb_used_meta":694235,"kb_avail":92507780,"statfs":{"total":96636764160,"available":94727966720,"internally_reserved":0,"allocated":1197518848,"data_stored":1196345865,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":365336,"internal_metadata":710896872},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":229441,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":416568,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":363740,"internal_metadata":0}]}} 
2026-03-25T15:45:19.312 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-25T15:45:19.712 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:19.712 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":250,"fsid":"08196b8a-fd91-49b2-b8a6-e1d21f829086","created":"2026-03-25T15:27:46.739365+0000","modified":"2026-03-25T15:45:17.892926+0000","last_up_change":"2026-03-25T15:27:51.947587+0000","last_in_change":"2026-03-25T15:27:47.831101+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":5,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":17,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-25T15:27:53.853201+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"12","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cach
e_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-25T15:27:55.461429+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"240","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":50,"snap_epoch":240,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_mi
cro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"4f49ea48-ae3c-407d-a183-56efa44295aa","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":241,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6817","nonce":2768043570}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6819","nonce":2768043570}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6823","nonce":2768043570}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":2768043570},{"type":"v1","addr":"192.168.123.104:6821","nonce":2768043570}]},"public_addr":"192.168.123.104:6817/2768043570","cluster_addr":"192.168.123.104:6819/2768043570","heartbeat_back_addr":"192.168.123.104:6823/2768043570","heartbeat_front_addr":"192.168.123.104:6821/2768043570","state":["exists","up"]},{"osd":1,"uuid":"f2612533-a570-4836-a45a-2ae6235f25d4","up":1,"in":1,"weight":1,"pri
mary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":241,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6809","nonce":2830815940}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6811","nonce":2830815940}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6815","nonce":2830815940}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":2830815940},{"type":"v1","addr":"192.168.123.104:6813","nonce":2830815940}]},"public_addr":"192.168.123.104:6809/2830815940","cluster_addr":"192.168.123.104:6811/2830815940","heartbeat_back_addr":"192.168.123.104:6815/2830815940","heartbeat_front_addr":"192.168.123.104:6813/2830815940","state":["exists","up"]},{"osd":2,"uuid":"e252769d-790d-4ead-9e7c-79a5105f87f1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":232,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6801","nonce":2246359187}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6803","nonce":2246359187}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6807","nonce":2246359187}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":2246359187},{"type":"v1","addr":"192.168.123.104:6805","nonce":2246359187}]},"public_addr":"192.168.123.104:6801/2246359187","cluster_addr":"192.168.123.104:6803/2246359187","heartbeat_back_addr":"192.168.123.104:6807/2246359187","heartbeat_front_addr":"192.168.123.10
4:6805/2246359187","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-25T15:27:49.995642+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-25T15:27:49.933284+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-25T15:27:49.709787+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.104:0/2599108749":"2026-03-25T16:44:53.691718+0000","192.168.123.104:0/3705412354":"2026-03-25T16:41:02.616643+0000","192.168.123.104:0/2646297707":"2026-03-25T16:31:15.478922+0000","192.168.123.104:0/1093454813":"2026-03-25T16:43:39.654882+0000","192.168.123.104:0/3897111736":"2026-03-25T16:36:15.668474+0000","192.168.123.104:0/1340742903":"2026-03-25T16:31:13.578693+0000","192.168.123.104:0/4292970630":"2026-03-25T16:31:14.517039+0000","192.168.123.104:0/3687224706":"2026-03-25T16:30:37.990931+0000","192.168.123.104:0/4132638779":"2026-03-25T16:31:13.183104+0000","192.168.123.104:0/635302791":"2026-03-25T16:30:36.920346+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":3,"snaps":[{"begin":2,"length":2}]},{"pool":4,"snaps":[{"begin":2,"length":2}]},{"pool":6,"snaps":[{"begin":3,"length":1}]},{"pool":10,"snaps":[{"begin":3,"length":2}]},{"pool":14,"snaps":[{"begin":2,"length":1}]},{"pool":15,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"de
graded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-25T15:45:20.725 INFO:tasks.ceph:Scrubbing osd.0 2026-03-25T15:45:20.725 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 config set osd_debug_deep_scrub_sleep 0 2026-03-25T15:45:20.808 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-25T15:45:20.809 INFO:teuthology.orchestra.run.vm04.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-25T15:45:20.809 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-25T15:45:20.817 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 0 2026-03-25T15:45:21.019 INFO:teuthology.orchestra.run.vm04.stderr:instructed osd(s) 0 to deep-scrub 2026-03-25T15:45:21.035 INFO:tasks.ceph:Scrubbing osd.1 2026-03-25T15:45:21.035 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 config set osd_debug_deep_scrub_sleep 0 2026-03-25T15:45:21.128 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-25T15:45:21.128 INFO:teuthology.orchestra.run.vm04.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-25T15:45:21.128 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-25T15:45:21.137 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 1 2026-03-25T15:45:21.342 INFO:teuthology.orchestra.run.vm04.stderr:instructed osd(s) 1 to deep-scrub 2026-03-25T15:45:21.357 INFO:tasks.ceph:Scrubbing osd.2 2026-03-25T15:45:21.357 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph 
tell osd.2 config set osd_debug_deep_scrub_sleep 0 2026-03-25T15:45:21.451 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-25T15:45:21.452 INFO:teuthology.orchestra.run.vm04.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-25T15:45:21.452 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-25T15:45:21.461 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 2 2026-03-25T15:45:21.668 INFO:teuthology.orchestra.run.vm04.stderr:instructed osd(s) 2 to deep-scrub 2026-03-25T15:45:21.680 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:45:21.877 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:21.878 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-25T15:45:21.890 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":771,"stamp":"2026-03-25T15:45:20.010603+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590395,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":83823,"num_read_kb":233753,"num_write":42658,"num_write_kb":8894265,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects
_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46451,"ondisk_log_size":46451,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":5463968,"kb_used_data":3271976,"kb_used_omap":1255,"kb_used_meta":2190680,"kb_avail":277651552,"statfs":{"total":289910292480,"available":284315189248,"internally_reserved":0,"allocated":3350503424,"data_stored":3346952769,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1285605,"internal_metadata":2243256859},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":22,"apply_latency_ms":22,"commit_latency_ns":22000000,"apply_latency_ns":22000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1543504216,"num_objects":-380,"num_object_clones":0,"num_object_copies":-760,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-380,"num_whiteouts":0,"num_read":-2945,"num_read_kb":-506061,"num_write":-2018,"num_write_kb":-1508088,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_
omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001582"},"pg_stats":[{"pgid":"2.7","version":"240'4706","reported_seq":9203,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.901456+0000","last_change":"2026-03-25T15:44:42.032154+0000","last_active":"2026-03-25T15:45:17.901456+0000","last_peered":"2026-03-25T15:45:17.901456+0000","last_clean":"2026-03-25T15:45:17.901456+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:45:17.901456+0000","last_undegraded":"2026-03-25T15:45:17.901456+0000","last_fullsized":"2026-03-25T15:45:17.901456+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4706,"log_dups_size":0,"ondisk_log_size":4706,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T16:35:08.519293+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0012947939999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8638,"num_read_kb":26386,"num_write":4191,"num_write_kb":1223163,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'5693","reported_seq":10018,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.901461+0000","last_change":"2026-03-25T15:44:42.038725+0000","last_active":"2026-03-25T15:45:17.901461+0000","last_peered":"2026-03-25T15:45:17.901461+0000","last_clean":"2026-03-25T15:45:17.901461+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:45:17.901461+0000","last_undegraded":"2026-03-25T15:45:17.901461+0000","last_fullsized":"2026-03-25T15:45:17.901461+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":5693,"log_dups_size":0,"ondisk_log_size":5693,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:54:27.969653+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0078476510000000006,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8601,"num_read_kb":25744,"num_write":4420,"num_write_kb":1147587,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'4531","reported_seq":8079,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.901500+0000","last_change":"2026-03-25T15:44:42.032413+0000","last_active":"2026-03-25T15:45:17.901500+0000","last_peered":"2026-03-25T15:45:17.901500+0000","last_clean":"2026-03-25T15:45:17.901500+0000","last_became_active":"2026-03
-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:45:17.901500+0000","last_undegraded":"2026-03-25T15:45:17.901500+0000","last_fullsized":"2026-03-25T15:45:17.901500+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4531,"log_dups_size":0,"ondisk_log_size":4531,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T19:13:48.703930+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0013904729999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5956,"num_read_kb":29993,"num_write":4026,"num_write_kb":1129299,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_miss
ing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"240'7174","reported_seq":12493,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.901500+0000","last_change":"2026-03-25T15:44:42.032249+0000","last_active":"2026-03-25T15:45:17.901500+0000","last_peered":"2026-03-25T15:45:17.901500+0000","last_clean":"2026-03-25T15:45:17.901500+0000","last_became_active":"2026-03-25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:45:17.901500+0000","last_undegraded":"2026-03-25T15:45:17.901500+0000","last_fullsized":"2026-03-25T15:45:17.901500+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7174,"log_dups_size":0,"ondisk_log_size":7174,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T23:08:43.358486+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011625450000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":11406,"num_read_kb":35122,"num_write":5800,"num_write_kb":1069558,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"240'4591","reported_seq":9992,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.898917+0000","last_change":"2026-03-25T15:44:42.029197+0000","last_active":"2026-03-25T15:45:17.898917+0000","last_peered":"2026-03-25T15:45:17.898917+0000","last_clean":"2026-03-25T15:45:17.898917+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:45:17.898917+0000","last_undegraded":"2026-03-25T15:45:17.898917+0000","last_fullsized":"2026-03-25T15:45:17.898917+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4591,"log_dups_size":0,"ondisk_log_size":4591,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T18:06:56.373227+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028202799999999998,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7637,"num_read_kb":27432,"num_write":4016,"num_write_kb":1039592,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"240'4352","reported_seq":7745,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:45:01.165809+0000","last_change":"2026-03-25T15:44:42.029939+0000","last_active":"2026-03-25T15:45:01.165809+0000","last_peered":"2026-03-25T15:45:01.165809+0000","last_clean":"2026-03-25T15:45:01.165809+0000","last_became_active":"2026-
03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:45:01.165809+0000","last_undegraded":"2026-03-25T15:45:01.165809+0000","last_fullsized":"2026-03-25T15:45:01.165809+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":4352,"log_dups_size":0,"ondisk_log_size":4352,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T16:31:48.660269+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026995600000000002,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":5646,"num_read_kb":20721,"num_write":3954,"num_write_kb":1133257,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_m
issing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"240'7763","reported_seq":12182,"reported_epoch":249,"state":"active+clean","last_fresh":"2026-03-25T15:44:57.054905+0000","last_change":"2026-03-25T15:44:42.030197+0000","last_active":"2026-03-25T15:44:57.054905+0000","last_peered":"2026-03-25T15:44:57.054905+0000","last_clean":"2026-03-25T15:44:57.054905+0000","last_became_active":"2026-03-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:44:57.054905+0000","last_undegraded":"2026-03-25T15:44:57.054905+0000","last_fullsized":"2026-03-25T15:44:57.054905+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7763,"log_dups_size":0,"ondisk_log_size":7763,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T21:31:40.142433+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00052921100000000003,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":19969,"num_read_kb":34601,"num_write":8465,"num_write_kb":1155971,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"240'7549","reported_seq":13311,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.901635+0000","last_change":"2026-03-25T15:44:42.032347+0000","last_active":"2026-03-25T15:45:17.901635+0000","last_peered":"2026-03-25T15:45:17.901635+0000","last_clean":"2026-03-25T15:45:17.901635+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:45:17.901635+0000","last_undegraded":"2026-03-25T15:45:17.901635+0000","last_fullsized":"2026-03-25T15:45:17.901635+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-25T15:27:55.966994+0000","last_clean_scrub_stamp":"2026-03-25T15:27:55.966994+0000","objects_scrubbed":0,"log_size":7549,"log_dups_size":0,"ondisk_log_size":7549,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T01:08:41.251655+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0011680359999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":15894,"num_read_kb":33690,"num_write":7663,"num_write_kb":993688,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"11'92","reported_seq":609,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:17.901576+0000","last_change":"2026-03-25T15:27:54.967550+0000","last_active":"2026-03-25T15:45:17.901576+0000","last_peered":"2026-03-25T15:45:17.901576+0000","last_clean":"2026-03-25T15:45:17.901576+0000","last_became_active":"2026-03-2
5T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:45:17.901576+0000","last_undegraded":"2026-03-25T15:45:17.901576+0000","last_fullsized":"2026-03-25T15:45:17.901576+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-25T15:27:53.958171+0000","last_clean_scrub_stamp":"2026-03-25T15:27:53.958171+0000","objects_scrubbed":0,"log_size":92,"log_dups_size":0,"ondisk_log_size":92,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T21:15:15.083319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_count
s":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":83747,"num_read_kb":233689,"num_write":42535,"num_write_kb":8892115,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1001183,"internal_metadata":0},"log_size":46359,"ondisk_log_size":46359,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_fl
ush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":92,"ondisk_log_size":92,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705875,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":787472,"kb_used_data":2680,"kb_used_omap":493,"kb_used_meta":784274,"kb_avail":93584368,"statfs":{"total":96636764160,"available":95830392832,"internally_reserved":0,"allocated":2744320,"data_stored":1563503,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":504882,"internal_metadata":803097550},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":8,"apply_latency_ms":8,"commit_latency_ns":8000000,"apply_latency_ns":8000000},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705876,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2812436,"kb_used_data":2099844,"kb_used_omap":405,"kb_used_meta":712170,"kb_avail":91559404,"statfs":{"total":96636764160,"available":93756829696,"internally_reserved":0,"allocated":2150240256,"data_stored":2149043401,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":415387,"internal_metadata":729262437},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":14,"apply_latency_ms":14,"co
mmit_latency_ns":14000000,"apply_latency_ns":14000000},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738579,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1864060,"kb_used_data":1169452,"kb_used_omap":356,"kb_used_meta":694235,"kb_avail":92507780,"statfs":{"total":96636764160,"available":94727966720,"internally_reserved":0,"allocated":1197518848,"data_stored":1196345865,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":365336,"internal_metadata":710896872},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":220775,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":416668,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":363740,"internal_metadata":0}]}} 2026-03-25T15:45:21.891 
INFO:tasks.ceph:pgid 2.7 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.891 INFO:tasks.ceph:pgid 2.6 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.891 INFO:tasks.ceph:pgid 2.5 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:pgid 2.4 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:pgid 2.2 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:pgid 2.1 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, 
tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:pgid 2.0 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:pgid 2.3 last_scrub_stamp 2026-03-25T15:27:55.966994+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=55, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:pgid 1.0 last_scrub_stamp 2026-03-25T15:27:53.958171+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=27, tm_sec=53, tm_wday=2, tm_yday=84, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=25, tm_hour=15, tm_min=45, tm_sec=19, tm_wday=2, tm_yday=84, tm_isdst=0) 2026-03-25T15:45:21.892 INFO:tasks.ceph:Still waiting for all pgs to be scrubbed. 
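The wait loop above polls `ceph pg dump` and logs each pg whose `last_scrub_stamp` is still at or before the time recorded when the scrub wait began. A minimal sketch of that comparison (illustrative only, not the actual `tasks.ceph` implementation; the function name and field handling here are assumptions):

```python
import time

def pending_scrubs(pg_stats, check_time_str):
    """Return pgids whose last_scrub_stamp is <= check_time_str.

    pg_stats: list of dicts with 'pgid' and 'last_scrub_stamp' keys,
    as found in `ceph pg dump --format=json` output.
    """
    fmt = "%Y-%m-%dT%H:%M:%S"
    # Drop fractional seconds and the +0000 suffix before parsing,
    # matching the second-granularity struct_time values in the log.
    check = time.strptime(check_time_str.split(".")[0], fmt)
    pending = []
    for pg in pg_stats:
        stamp = time.strptime(pg["last_scrub_stamp"].split(".")[0], fmt)
        if stamp <= check:  # struct_time compares field-by-field, like a tuple
            pending.append(pg["pgid"])
    return pending  # empty list means every pg was scrubbed after check time

# Example mirroring the log: pg 2.7 was last scrubbed at 15:27:55,
# before the 15:45:19 check, so it is still pending.
pgs = [{"pgid": "2.7",
        "last_scrub_stamp": "2026-03-25T15:27:55.966994+0000"}]
print(pending_scrubs(pgs, "2026-03-25T15:45:19.000000+0000"))
```

This is why the log prints pairs of `struct_time` values joined by `<=`: the task keeps looping ("Still waiting for all pgs to be scrubbed.") until the pending list is empty, which happens in the next dump below once the pgs report fresh `last_scrub_stamp` values.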
2026-03-25T15:45:41.892 DEBUG:teuthology.orchestra.run.vm04:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-25T15:45:42.102 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:42.102 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-25T15:45:42.114 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":782,"stamp":"2026-03-25T15:45:42.013626+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":590395,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":83823,"num_read_kb":233753,"num_write":42658,"num_write_kb":8894265,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46460,"ondisk_log_size":46460,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":2199436,"kb_used_data":7452,"kb_used_omap":1255,"kb_used_meta":2190680,"kb_avail":280916084,"statfs":{"total":28991029
2480,"available":287658070016,"internally_reserved":0,"allocated":7630848,"data_stored":4100114,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1285908,"internal_metadata":2243256556},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"12.001690"},"pg_stats":[{"pgid":"2.7","version":"240'4706","reported_seq":9211,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:25.594619+0000","last_change":"2026-03-25T15:45:25.594619+0000","last_active":"2026-03-25T15:45:25.594619+0000","last_peered":"2026-03-
25T15:45:25.594619+0000","last_clean":"2026-03-25T15:45:25.594619+0000","last_became_active":"2026-03-25T15:27:56.978452+0000","last_became_peered":"2026-03-25T15:27:56.978452+0000","last_unstale":"2026-03-25T15:45:25.594619+0000","last_undegraded":"2026-03-25T15:45:25.594619+0000","last_fullsized":"2026-03-25T15:45:25.594619+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'4706","last_scrub_stamp":"2026-03-25T15:45:25.594572+0000","last_deep_scrub":"240'4706","last_deep_scrub_stamp":"2026-03-25T15:45:25.594572+0000","last_clean_scrub_stamp":"2026-03-25T15:45:25.594572+0000","objects_scrubbed":0,"log_size":4706,"log_dups_size":0,"ondisk_log_size":4706,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-27T02:21:02.753593+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.0012947939999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8638,"num_read_kb":26386,"num_write":4191,"num_write_kb":1223163,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"240'5693","reported_seq":10026,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:24.627140+0000","last_change":"2026-03-25T15:45:24.627140+0000","last_active":"2026-03-25T15:45:24.627140+0000","last_peered":"2026-03-25T15:45:24.627140+0000","last_clean":"2026-03-25T15:45:24.627140+0000","last_became_active":"2026-03-25T15:27:56.978427+0000","last_became_peered":"2026-03-25T15:27:56.978427+0000","last_unstale":"2026-03-25T15:45:24.627140+0000","last_undegraded":"2026-03-25T15:45:24.627140+0000","last_fullsized":"2026-03-25T15:45:24.627140+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'5693","last_scrub_stamp":"2026-03-25T15:45:24.627022+0000","last_deep_scrub":"240'5693
","last_deep_scrub_stamp":"2026-03-25T15:45:24.627022+0000","last_clean_scrub_stamp":"2026-03-25T15:45:24.627022+0000","objects_scrubbed":0,"log_size":5693,"log_dups_size":0,"ondisk_log_size":5693,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T22:33:50.347282+0000","scrub_duration":4,"objects_trimmed":0,"snaptrim_duration":0.0078476510000000006,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8601,"num_read_kb":25744,"num_write":4420,"num_write_kb":1147587,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"240'4531","reported_seq":8087,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:23.663685+0000","last_change":"2026-03-25T15:45:23.663685+0000","last_active":"2026-03-25T15:45:23.663685+0000","last_peered":"2026-03-25T15:45:23.663685+0000","last_clean":"2026-03-25T15:45:23.663685+0000","last_became_active
":"2026-03-25T15:27:56.978544+0000","last_became_peered":"2026-03-25T15:27:56.978544+0000","last_unstale":"2026-03-25T15:45:23.663685+0000","last_undegraded":"2026-03-25T15:45:23.663685+0000","last_fullsized":"2026-03-25T15:45:23.663685+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'4531","last_scrub_stamp":"2026-03-25T15:45:23.663637+0000","last_deep_scrub":"240'4531","last_deep_scrub_stamp":"2026-03-25T15:45:23.663637+0000","last_clean_scrub_stamp":"2026-03-25T15:45:23.663637+0000","objects_scrubbed":0,"log_size":4531,"log_dups_size":0,"ondisk_log_size":4531,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T00:49:30.117052+0000","scrub_duration":4,"objects_trimmed":0,"snaptrim_duration":0.0013904729999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5956,"num_read_kb":29993,"num_write":4026,"num_write_kb":1129299,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":
[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"250'7176","reported_seq":12503,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:26.575391+0000","last_change":"2026-03-25T15:45:26.575391+0000","last_active":"2026-03-25T15:45:26.575391+0000","last_peered":"2026-03-25T15:45:26.575391+0000","last_clean":"2026-03-25T15:45:26.575391+0000","last_became_active":"2026-03-25T15:27:56.977898+0000","last_became_peered":"2026-03-25T15:27:56.977898+0000","last_unstale":"2026-03-25T15:45:26.575391+0000","last_undegraded":"2026-03-25T15:45:26.575391+0000","last_fullsized":"2026-03-25T15:45:26.575391+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"250'7176","last_scrub_stamp":"2026-03-25T15:45:26.575356+0000","last_deep_scrub":"250'7176","last_deep_scrub_stamp":"2026-03-25T15:45:26.575356+0000","last_clean_scrub_stamp":"2026-03-25T15:45:26.575356+0000","objects_scrubbed":2,"log_size":7176,"log_dups_size":0,"ondisk_log_size":7176,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T23:16:12.243819+0000","scrub_duration":11,"objects_trimmed":0,"snaptrim_duration":0.0011625450000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":11406,"num_read_kb":35122,"num_write":5800,"num_write_kb":1069558,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"250'4593","reported_seq":10002,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:21.430236+0000","last_change":"2026-03-25T15:45:21.430236+0000","last_active":"2026-03-25T15:45:21.430236+0000","last_peered":"2026-03-25T15:45:21.430236+0000","last_clean":"2026-03-25T15:45:21.430236+0000","last_became_active":"2026-03-25T15:27:56.980723+0000","last_became_peered":"2026-03-25T15:27:56.980723+0000","last_unstale":"2026-03-25T15:45:21.430236+0000","last_undegraded":"2026-03-25T15:45:21.430236+0000","last_fullsized":"2026-03-25T15:45:21.430236+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"250'4593","last_scrub_stamp":"2026-03-25T15:45:21.430165+0000","last_deep_scrub":"250'45
93","last_deep_scrub_stamp":"2026-03-25T15:45:21.430165+0000","last_clean_scrub_stamp":"2026-03-25T15:45:21.430165+0000","objects_scrubbed":2,"log_size":4593,"log_dups_size":0,"ondisk_log_size":4593,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T00:27:38.722647+0000","scrub_duration":14,"objects_trimmed":0,"snaptrim_duration":0.00028202799999999998,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7637,"num_read_kb":27432,"num_write":4016,"num_write_kb":1039592,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"250'4354","reported_seq":7757,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:22.191222+0000","last_change":"2026-03-25T15:45:22.191222+0000","last_active":"2026-03-25T15:45:22.191222+0000","last_peered":"2026-03-25T15:45:22.191222+0000","last_clean":"2026-03-25T15:45:22.191222+0000","last_became_a
ctive":"2026-03-25T15:27:56.977608+0000","last_became_peered":"2026-03-25T15:27:56.977608+0000","last_unstale":"2026-03-25T15:45:22.191222+0000","last_undegraded":"2026-03-25T15:45:22.191222+0000","last_fullsized":"2026-03-25T15:45:22.191222+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"250'4354","last_scrub_stamp":"2026-03-25T15:45:22.191184+0000","last_deep_scrub":"250'4354","last_deep_scrub_stamp":"2026-03-25T15:45:22.191184+0000","last_clean_scrub_stamp":"2026-03-25T15:45:22.191184+0000","objects_scrubbed":2,"log_size":4354,"log_dups_size":0,"ondisk_log_size":4354,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T01:37:20.161696+0000","scrub_duration":10,"objects_trimmed":0,"snaptrim_duration":0.00026995600000000002,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":5646,"num_read_kb":20721,"num_write":3954,"num_write_kb":1133257,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"a
cting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"240'7763","reported_seq":12192,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:23.143286+0000","last_change":"2026-03-25T15:45:23.143286+0000","last_active":"2026-03-25T15:45:23.143286+0000","last_peered":"2026-03-25T15:45:23.143286+0000","last_clean":"2026-03-25T15:45:23.143286+0000","last_became_active":"2026-03-25T15:27:56.977549+0000","last_became_peered":"2026-03-25T15:27:56.977549+0000","last_unstale":"2026-03-25T15:45:23.143286+0000","last_undegraded":"2026-03-25T15:45:23.143286+0000","last_fullsized":"2026-03-25T15:45:23.143286+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"240'7763","last_scrub_stamp":"2026-03-25T15:45:23.143236+0000","last_deep_scrub":"240'7763","last_deep_scrub_stamp":"2026-03-25T15:45:23.143236+0000","last_clean_scrub_stamp":"2026-03-25T15:45:23.143236+0000","objects_scrubbed":0,"log_size":7763,"log_dups_size":0,"ondisk_log_size":7763,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T18:02:34.491834+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.00052921100000000003,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":19969,"num_read_kb":34601,"num_write":8465,"num_write_kb":1155971,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"250'7550","reported_seq":13320,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:22.646642+0000","last_change":"2026-03-25T15:45:22.646642+0000","last_active":"2026-03-25T15:45:22.646642+0000","last_peered":"2026-03-25T15:45:22.646642+0000","last_clean":"2026-03-25T15:45:22.646642+0000","last_became_active":"2026-03-25T15:27:56.978893+0000","last_became_peered":"2026-03-25T15:27:56.978893+0000","last_unstale":"2026-03-25T15:45:22.646642+0000","last_undegraded":"2026-03-25T15:45:22.646642+0000","last_fullsized":"2026-03-25T15:45:22.646642+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"250'7550","last_scrub_stamp":"2026-03-25T15:45:22.646607+0000","last_deep_scrub":"250'75
50","last_deep_scrub_stamp":"2026-03-25T15:45:22.646607+0000","last_clean_scrub_stamp":"2026-03-25T15:45:22.646607+0000","objects_scrubbed":1,"log_size":7550,"log_dups_size":0,"ondisk_log_size":7550,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-27T00:24:42.412613+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.0011680359999999999,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":15894,"num_read_kb":33690,"num_write":7663,"num_write_kb":993688,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"250'94","reported_seq":619,"reported_epoch":250,"state":"active+clean","last_fresh":"2026-03-25T15:45:21.606595+0000","last_change":"2026-03-25T15:45:21.606595+0000","last_active":"2026-03-25T15:45:21.606595+0000","last_peered":"2026-03-25T15:45:21.606595+0000","last_clean":"2026-03-25T15:45:21.606595+0000","last_became_active"
:"2026-03-25T15:27:54.967418+0000","last_became_peered":"2026-03-25T15:27:54.967418+0000","last_unstale":"2026-03-25T15:45:21.606595+0000","last_undegraded":"2026-03-25T15:45:21.606595+0000","last_fullsized":"2026-03-25T15:45:21.606595+0000","mapping_epoch":10,"log_start":"0'0","ondisk_log_start":"0'0","created":10,"last_epoch_clean":11,"parent":"0.0","parent_split_bits":0,"last_scrub":"250'94","last_scrub_stamp":"2026-03-25T15:45:21.606560+0000","last_deep_scrub":"250'94","last_deep_scrub_stamp":"2026-03-25T15:45:21.606560+0000","last_clean_scrub_stamp":"2026-03-25T15:45:21.606560+0000","objects_scrubbed":2,"log_size":94,"log_dups_size":0,"ondisk_log_size":94,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T17:42:34.150092+0000","scrub_duration":4,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"obje
ct_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":83747,"num_read_kb":233689,"num_write":42535,"num_write_kb":8892115,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1001318,"internal_metadata":0},"log_size":46366,"ondisk_log_size":46366,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":590368,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":76,"num_read_kb":64,"num_write":123,"num_write_kb":2150,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_p
romote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1187840,"data_stored":1180736,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":94,"ondisk_log_size":94,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":1,"up_from":9,"seq":38654705879,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":787464,"kb_used_data":2680,"kb_used_omap":493,"kb_used_meta":784274,"kb_avail":93584376,"statfs":{"total":96636764160,"available":95830401024,"internally_reserved":0,"allocated":2744320,"data_stored":1563503,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":504882,"internal_metadata":803097550},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":9,"seq":38654705880,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":715272,"kb_used_data":2680,"kb_used_omap":405,"kb_used_meta":712170,"kb_avail":93656568,"statfs":{"total":96636764160,"available":95904325632,"internally_reserved":0,"allocated":2744320,"data_stored":1563503,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":415555,"internal_metadata":729262269},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_late
ncy_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":2,"up_from":8,"seq":34359738583,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":696700,"kb_used_data":2092,"kb_used_omap":356,"kb_used_meta":694235,"kb_avail":93675140,"statfs":{"total":96636764160,"available":95923343360,"internally_reserved":0,"allocated":2142208,"data_stored":973108,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":365471,"internal_metadata":710896737},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":593920,"data_stored":590368,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":220775,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":416668,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":363875,"internal_metadata":0}]}} 2026-03-25T15:45:42.114 DEBUG:teuthology.orchestra.run.vm04:> sudo 
ceph --cluster ceph config set global mon_health_to_clog false 2026-03-25T15:45:42.323 INFO:teuthology.misc:Shutting down mds daemons... 2026-03-25T15:45:42.323 INFO:teuthology.misc:Shutting down osd daemons... 2026-03-25T15:45:42.323 DEBUG:tasks.ceph.osd.0:waiting for process to exit 2026-03-25T15:45:42.323 INFO:teuthology.orchestra.run:waiting for 300 2026-03-25T15:45:42.390 INFO:tasks.ceph.osd.0:Stopped 2026-03-25T15:45:42.390 DEBUG:tasks.ceph.osd.1:waiting for process to exit 2026-03-25T15:45:42.391 INFO:teuthology.orchestra.run:waiting for 300 2026-03-25T15:45:42.469 INFO:tasks.ceph.osd.1:Stopped 2026-03-25T15:45:42.470 DEBUG:tasks.ceph.osd.2:waiting for process to exit 2026-03-25T15:45:42.470 INFO:teuthology.orchestra.run:waiting for 300 2026-03-25T15:45:42.530 INFO:tasks.ceph.osd.2:Stopped 2026-03-25T15:45:42.530 INFO:teuthology.misc:Shutting down mgr daemons... 2026-03-25T15:45:42.530 DEBUG:tasks.ceph.mgr.x:waiting for process to exit 2026-03-25T15:45:42.530 INFO:teuthology.orchestra.run:waiting for 300 2026-03-25T15:45:42.575 INFO:tasks.ceph.mgr.x:Stopped 2026-03-25T15:45:42.575 INFO:teuthology.misc:Shutting down mon daemons... 2026-03-25T15:45:42.576 DEBUG:tasks.ceph.mon.a:waiting for process to exit 2026-03-25T15:45:42.576 INFO:teuthology.orchestra.run:waiting for 300 2026-03-25T15:45:42.587 INFO:tasks.ceph.mon.a:Stopped 2026-03-25T15:45:42.587 INFO:tasks.ceph:Checking cluster log for badness... 
2026-03-25T15:45:42.587 DEBUG:teuthology.orchestra.run.vm04:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v '\(OSD_SLOW_PING_TIME' | head -n 1 2026-03-25T15:45:42.654 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-0 on ubuntu@vm04.local 2026-03-25T15:45:42.654 DEBUG:teuthology.orchestra.run.vm04:> sync && sudo umount -f /var/lib/ceph/osd/ceph-0 2026-03-25T15:45:42.771 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-1 on ubuntu@vm04.local 2026-03-25T15:45:42.772 DEBUG:teuthology.orchestra.run.vm04:> sync && sudo umount -f /var/lib/ceph/osd/ceph-1 2026-03-25T15:45:42.847 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-2 on ubuntu@vm04.local 2026-03-25T15:45:42.847 DEBUG:teuthology.orchestra.run.vm04:> sync && sudo umount -f /var/lib/ceph/osd/ceph-2 2026-03-25T15:45:42.917 INFO:tasks.ceph:Archiving mon data... 2026-03-25T15:45:42.917 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/lib/ceph/mon/ceph-a to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645/data/mon.a.tgz 2026-03-25T15:45:42.917 DEBUG:teuthology.orchestra.run.vm04:> mktemp 2026-03-25T15:45:42.932 INFO:teuthology.orchestra.run.vm04.stdout:/tmp/tmp.75IcDplel4 2026-03-25T15:45:42.933 DEBUG:teuthology.orchestra.run.vm04:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-a -- . > /tmp/tmp.75IcDplel4 2026-03-25T15:45:43.068 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0666 /tmp/tmp.75IcDplel4 2026-03-25T15:45:43.149 DEBUG:teuthology.orchestra.remote:vm04:/tmp/tmp.75IcDplel4 is 434KB 2026-03-25T15:45:43.207 DEBUG:teuthology.orchestra.run.vm04:> rm -fr /tmp/tmp.75IcDplel4 2026-03-25T15:45:43.222 INFO:tasks.ceph:Cleaning ceph cluster... 
2026-03-25T15:45:43.222 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /etc/ceph/ceph.conf /etc/ceph/ceph.keyring /home/ubuntu/cephtest/ceph.data /home/ubuntu/cephtest/ceph.monmap /home/ubuntu/cephtest/../*.pid 2026-03-25T15:45:43.343 INFO:teuthology.util.scanner:summary_data or yaml_file is empty! 2026-03-25T15:45:43.343 INFO:tasks.ceph:Archiving crash dumps... 2026-03-25T15:45:43.343 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/lib/ceph/crash to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645/remote/vm04/crash 2026-03-25T15:45:43.343 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/lib/ceph/crash -- . 2026-03-25T15:45:43.411 INFO:tasks.ceph:Compressing logs... 2026-03-25T15:45:43.411 DEBUG:teuthology.orchestra.run.vm04:> time sudo find /var/log/ceph -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-25T15:45:43.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.tmp-client.admin.51700.log 2026-03-25T15:45:43.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.0.log 2026-03-25T15:45:43.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.1.log 2026-03-25T15:45:43.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph.tmp-client.admin.51700.log: 0.0% -- replaced with /var/log/ceph/ceph.tmp-client.admin.51700.log.gz 2026-03-25T15:45:43.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/ceph-osd.2.log 2026-03-25T15:45:43.438 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mon.a.log 2026-03-25T15:45:43.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-osd.1.log: /var/log/ceph/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56333.log 2026-03-25T15:45:43.449 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-mon.a.log: gzip -5 --verbose -- /var/log/ceph/ceph.log 2026-03-25T15:45:43.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56333.log.gz 2026-03-25T15:45:43.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mgr.x.log 2026-03-25T15:45:43.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph.log: 86.9% -- replaced with /var/log/ceph/ceph.log.gz 2026-03-25T15:45:43.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56440.log 2026-03-25T15:45:43.482 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-mgr.x.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56573.log 2026-03-25T15:45:43.482 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56440.log.gz 2026-03-25T15:45:43.489 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56620.log 2026-03-25T15:45:43.489 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56573.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56573.log.gz 2026-03-25T15:45:43.499 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56667.log 2026-03-25T15:45:43.499 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56620.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56620.log.gz 2026-03-25T15:45:43.508 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.audit.log 2026-03-25T15:45:43.508 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56667.log.gz 2026-03-25T15:45:43.518 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56792.log 2026-03-25T15:45:43.520 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph.audit.log: 89.7% -- replaced with /var/log/ceph/ceph.audit.log.gz 2026-03-25T15:45:43.528 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58029.log 2026-03-25T15:45:43.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.56792.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56792.log.gz 2026-03-25T15:45:43.538 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59360.log 2026-03-25T15:45:43.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.58029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58029.log.gz 2026-03-25T15:45:43.545 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59440.log 2026-03-25T15:45:43.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59360.log.gz 2026-03-25T15:45:43.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59465.log 2026-03-25T15:45:43.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59440.log.gz 2026-03-25T15:45:43.562 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59530.log 2026-03-25T15:45:43.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59465.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59465.log.gz 2026-03-25T15:45:43.572 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59579.log 2026-03-25T15:45:43.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59530.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.59530.log.gz 2026-03-25T15:45:43.580 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59628.log 2026-03-25T15:45:43.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59579.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59579.log.gz 2026-03-25T15:45:43.590 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59677.log 2026-03-25T15:45:43.590 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59628.log.gz 2026-03-25T15:45:43.598 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59779.log 2026-03-25T15:45:43.598 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59677.log.gz 2026-03-25T15:45:43.608 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59778.log 2026-03-25T15:45:43.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59779.log.gz 2026-03-25T15:45:43.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59780.log 2026-03-25T15:45:43.616 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59778.log.gz 2026-03-25T15:45:43.625 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59873.log 2026-03-25T15:45:43.625 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59780.log.gz 2026-03-25T15:45:43.632 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.59943.log 2026-03-25T15:45:43.632 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59873.log.gz 2026-03-25T15:45:43.642 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59945.log 2026-03-25T15:45:43.642 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59943.log.gz 2026-03-25T15:45:43.650 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60020.log 2026-03-25T15:45:43.650 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.59945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59945.log.gz 2026-03-25T15:45:43.660 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60092.log 2026-03-25T15:45:43.660 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60020.log.gz 2026-03-25T15:45:43.669 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60094.log 2026-03-25T15:45:43.669 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60092.log.gz 2026-03-25T15:45:43.676 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60167.log 2026-03-25T15:45:43.676 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60094.log.gz 2026-03-25T15:45:43.684 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60204.log 2026-03-25T15:45:43.684 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60167.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.60167.log.gz 2026-03-25T15:45:43.694 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60241.log 2026-03-25T15:45:43.694 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60204.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60204.log.gz 2026-03-25T15:45:43.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60314.log 2026-03-25T15:45:43.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60241.log.gz 2026-03-25T15:45:43.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60363.log 2026-03-25T15:45:43.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60314.log.gz 2026-03-25T15:45:43.722 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60412.log 2026-03-25T15:45:43.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60363.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60363.log.gz 2026-03-25T15:45:43.731 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60677.log 2026-03-25T15:45:43.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60412.log.gz 2026-03-25T15:45:43.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60696.log 2026-03-25T15:45:43.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60677.log.gz 2026-03-25T15:45:43.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.60724.log 2026-03-25T15:45:43.750 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60696.log.gz 2026-03-25T15:45:43.760 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60745.log 2026-03-25T15:45:43.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60724.log.gz 2026-03-25T15:45:43.767 INFO:teuthology.orchestra.run.vm04.stderr: 92.1% -- replaced with /var/log/ceph/ceph-mgr.x.log.gz 2026-03-25T15:45:43.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60767.log 2026-03-25T15:45:43.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60745.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60745.log.gz 2026-03-25T15:45:43.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60788.log 2026-03-25T15:45:43.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60767.log.gz 2026-03-25T15:45:43.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60810.log 2026-03-25T15:45:43.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60788.log.gz 2026-03-25T15:45:43.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60831.log 2026-03-25T15:45:43.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60810.log.gz 2026-03-25T15:45:43.826 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60853.log 
2026-03-25T15:45:43.826 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60831.log.gz 2026-03-25T15:45:43.839 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60874.log 2026-03-25T15:45:43.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60853.log.gz 2026-03-25T15:45:43.846 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60896.log 2026-03-25T15:45:43.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60874.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60874.log.gz 2026-03-25T15:45:43.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60917.log 2026-03-25T15:45:43.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60896.log.gz 2026-03-25T15:45:43.863 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60939.log 2026-03-25T15:45:43.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60917.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60917.log.gz 2026-03-25T15:45:43.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60960.log 2026-03-25T15:45:43.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60939.log.gz 2026-03-25T15:45:43.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60982.log 2026-03-25T15:45:43.880 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60960.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60960.log.gz 2026-03-25T15:45:43.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61003.log 2026-03-25T15:45:43.890 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.60982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60982.log.gz 2026-03-25T15:45:43.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61025.log 2026-03-25T15:45:43.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61003.log.gz 2026-03-25T15:45:43.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61046.log 2026-03-25T15:45:43.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61025.log.gz 2026-03-25T15:45:43.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61068.log 2026-03-25T15:45:43.914 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61046.log.gz 2026-03-25T15:45:43.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61089.log 2026-03-25T15:45:43.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61068.log.gz 2026-03-25T15:45:43.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61111.log 2026-03-25T15:45:43.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61089.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61089.log.gz 2026-03-25T15:45:43.942 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61132.log 2026-03-25T15:45:43.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61111.log.gz 2026-03-25T15:45:43.949 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61154.log 2026-03-25T15:45:43.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61132.log.gz 2026-03-25T15:45:43.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61175.log 2026-03-25T15:45:43.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61154.log.gz 2026-03-25T15:45:43.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61197.log 2026-03-25T15:45:43.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61175.log.gz 2026-03-25T15:45:43.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61218.log 2026-03-25T15:45:43.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61197.log.gz 2026-03-25T15:45:43.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61240.log 2026-03-25T15:45:43.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61218.log.gz 2026-03-25T15:45:43.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61261.log 2026-03-25T15:45:43.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61240.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.61240.log.gz 2026-03-25T15:45:44.000 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61283.log 2026-03-25T15:45:44.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61261.log.gz 2026-03-25T15:45:44.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61304.log 2026-03-25T15:45:44.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61283.log.gz 2026-03-25T15:45:44.017 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61326.log 2026-03-25T15:45:44.017 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61304.log.gz 2026-03-25T15:45:44.027 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61347.log 2026-03-25T15:45:44.027 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61326.log.gz 2026-03-25T15:45:44.034 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61369.log 2026-03-25T15:45:44.034 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61347.log.gz 2026-03-25T15:45:44.044 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61390.log 2026-03-25T15:45:44.044 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61369.log.gz 2026-03-25T15:45:44.051 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61412.log 2026-03-25T15:45:44.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61390.log.gz 2026-03-25T15:45:44.061 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61433.log 2026-03-25T15:45:44.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61412.log.gz 2026-03-25T15:45:44.068 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61455.log 2026-03-25T15:45:44.068 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61433.log.gz 2026-03-25T15:45:44.076 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61476.log 2026-03-25T15:45:44.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61455.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61455.log.gz 2026-03-25T15:45:44.084 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61497.log 2026-03-25T15:45:44.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61476.log.gz 2026-03-25T15:45:44.091 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61521.log 2026-03-25T15:45:44.091 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61497.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61497.log.gz 2026-03-25T15:45:44.101 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61547.log 2026-03-25T15:45:44.101 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61521.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.61521.log.gz 2026-03-25T15:45:44.108 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61565.log 2026-03-25T15:45:44.108 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61547.log.gz 2026-03-25T15:45:44.118 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61584.log 2026-03-25T15:45:44.118 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61565.log.gz 2026-03-25T15:45:44.125 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61601.log 2026-03-25T15:45:44.134 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61584.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61619.log 2026-03-25T15:45:44.134 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61601.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61601.log.gz 2026-03-25T15:45:44.135 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61584.log.gz 2026-03-25T15:45:44.145 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61639.log 2026-03-25T15:45:44.145 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61619.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61619.log.gz 2026-03-25T15:45:44.152 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61657.log 2026-03-25T15:45:44.152 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61639.log.gz 2026-03-25T15:45:44.161 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61677.log 2026-03-25T15:45:44.165 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61657.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61657.log.gz 2026-03-25T15:45:44.169 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61695.log 2026-03-25T15:45:44.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61677.log.gz 2026-03-25T15:45:44.176 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61715.log 2026-03-25T15:45:44.179 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61695.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61695.log.gz 2026-03-25T15:45:44.183 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61733.log 2026-03-25T15:45:44.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61715.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61715.log.gz 2026-03-25T15:45:44.196 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61735.log 2026-03-25T15:45:44.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61733.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61733.log.gz 2026-03-25T15:45:44.203 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61737.log 2026-03-25T15:45:44.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61735.log.gz 2026-03-25T15:45:44.213 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61739.log 2026-03-25T15:45:44.213 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61737.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.61737.log.gz 2026-03-25T15:45:44.220 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61741.log 2026-03-25T15:45:44.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61739.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61739.log.gz 2026-03-25T15:45:44.230 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61760.log 2026-03-25T15:45:44.230 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61741.log.gz 2026-03-25T15:45:44.237 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61780.log 2026-03-25T15:45:44.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61760.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61760.log.gz 2026-03-25T15:45:44.247 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61798.log 2026-03-25T15:45:44.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61780.log.gz 2026-03-25T15:45:44.253 INFO:teuthology.orchestra.run.vm04.stderr: 92.0% -- replaced with /var/log/ceph/ceph-mon.a.log.gz 2026-03-25T15:45:44.254 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61818.log 2026-03-25T15:45:44.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61798.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61798.log.gz 2026-03-25T15:45:44.254 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61836.log 2026-03-25T15:45:44.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61818.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.61818.log.gz 2026-03-25T15:45:44.255 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61856.log 2026-03-25T15:45:44.255 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61836.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61836.log.gz 2026-03-25T15:45:44.255 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61874.log 2026-03-25T15:45:44.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61856.log.gz 2026-03-25T15:45:44.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61894.log 2026-03-25T15:45:44.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61874.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61874.log.gz 2026-03-25T15:45:44.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61912.log 2026-03-25T15:45:44.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61894.log.gz 2026-03-25T15:45:44.257 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61932.log 2026-03-25T15:45:44.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61912.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61912.log.gz 2026-03-25T15:45:44.257 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61950.log 2026-03-25T15:45:44.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61932.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61932.log.gz 2026-03-25T15:45:44.258 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61970.log 2026-03-25T15:45:44.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61950.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.61950.log.gz 2026-03-25T15:45:44.258 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61988.log 2026-03-25T15:45:44.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61970.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61970.log.gz 2026-03-25T15:45:44.259 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62008.log 2026-03-25T15:45:44.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.61988.log: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.61988.log.gz 2026-03-25T15:45:44.259 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62026.log 2026-03-25T15:45:44.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62008.log.gz 2026-03-25T15:45:44.260 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62046.log 2026-03-25T15:45:44.260 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62026.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.62026.log.gz 2026-03-25T15:45:44.260 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62064.log 2026-03-25T15:45:44.260 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62046.log.gz 2026-03-25T15:45:44.261 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62084.log 2026-03-25T15:45:44.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62064.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.62064.log.gz 2026-03-25T15:45:44.261 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62102.log 2026-03-25T15:45:44.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62084.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62084.log.gz 2026-03-25T15:45:44.262 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62122.log 2026-03-25T15:45:44.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62102.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.62102.log.gz 2026-03-25T15:45:44.262 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62140.log 2026-03-25T15:45:44.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62122.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62122.log.gz 2026-03-25T15:45:44.263 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62160.log 2026-03-25T15:45:44.263 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62140.log.gz 2026-03-25T15:45:44.263 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62178.log 2026-03-25T15:45:44.263 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62160.log.gz 2026-03-25T15:45:44.264 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62196.log 2026-03-25T15:45:44.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62178.log.gz 2026-03-25T15:45:44.264 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.62216.log
2026-03-25T15:45:44.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62196.log.gz
2026-03-25T15:45:44.265 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62234.log
2026-03-25T15:45:44.265 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62216.log.gz
2026-03-25T15:45:44.265 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62254.log
2026-03-25T15:45:44.265 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62234.log.gz
2026-03-25T15:45:44.266 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62272.log
2026-03-25T15:45:44.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62254.log.gz
2026-03-25T15:45:44.266 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62292.log
2026-03-25T15:45:44.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62272.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62272.log.gz
2026-03-25T15:45:44.267 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62310.log
2026-03-25T15:45:44.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62292.log.gz
2026-03-25T15:45:44.267 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62329.log
2026-03-25T15:45:44.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62310.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62310.log.gz
2026-03-25T15:45:44.268 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62353.log
2026-03-25T15:45:44.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62329.log.gz
2026-03-25T15:45:44.268 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62373.log
2026-03-25T15:45:44.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62353.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62353.log.gz
2026-03-25T15:45:44.268 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62393.log
2026-03-25T15:45:44.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62373.log.gz
2026-03-25T15:45:44.269 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62414.log
2026-03-25T15:45:44.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62393.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62393.log.gz
2026-03-25T15:45:44.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62435.log
2026-03-25T15:45:44.270 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62414.log.gz
2026-03-25T15:45:44.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62455.log
2026-03-25T15:45:44.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62435.log.gz
2026-03-25T15:45:44.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62476.log
2026-03-25T15:45:44.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62455.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62455.log.gz
2026-03-25T15:45:44.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62497.log
2026-03-25T15:45:44.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62476.log.gz
2026-03-25T15:45:44.272 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62519.log
2026-03-25T15:45:44.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62497.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62497.log.gz
2026-03-25T15:45:44.273 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62540.log
2026-03-25T15:45:44.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62519.log.gz
2026-03-25T15:45:44.273 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62562.log
2026-03-25T15:45:44.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62540.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62540.log.gz
2026-03-25T15:45:44.274 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62583.log
2026-03-25T15:45:44.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62562.log.gz
2026-03-25T15:45:44.274 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62605.log
2026-03-25T15:45:44.275 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62583.log.gz
2026-03-25T15:45:44.275 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62626.log
2026-03-25T15:45:44.275 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62605.log.gz
2026-03-25T15:45:44.275 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62648.log
2026-03-25T15:45:44.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62626.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62626.log.gz
2026-03-25T15:45:44.276 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62669.log
2026-03-25T15:45:44.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62648.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62648.log.gz
2026-03-25T15:45:44.277 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62691.log
2026-03-25T15:45:44.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62669.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62669.log.gz
2026-03-25T15:45:44.277 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62712.log
2026-03-25T15:45:44.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62691.log.gz
2026-03-25T15:45:44.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62734.log
2026-03-25T15:45:44.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62712.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62712.log.gz
2026-03-25T15:45:44.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62755.log
2026-03-25T15:45:44.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62734.log.gz
2026-03-25T15:45:44.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62777.log
2026-03-25T15:45:44.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62755.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62755.log.gz
2026-03-25T15:45:44.279 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62798.log
2026-03-25T15:45:44.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62777.log.gz
2026-03-25T15:45:44.280 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62820.log
2026-03-25T15:45:44.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62798.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62798.log.gz
2026-03-25T15:45:44.280 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62841.log
2026-03-25T15:45:44.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62820.log.gz
2026-03-25T15:45:44.281 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62863.log
2026-03-25T15:45:44.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62841.log.gz
2026-03-25T15:45:44.281 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62884.log
2026-03-25T15:45:44.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62863.log.gz
2026-03-25T15:45:44.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62906.log
2026-03-25T15:45:44.282 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62884.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62884.log.gz
2026-03-25T15:45:44.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62927.log
2026-03-25T15:45:44.282 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62906.log.gz
2026-03-25T15:45:44.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62949.log
2026-03-25T15:45:44.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62927.log.gz
2026-03-25T15:45:44.283 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62970.log
2026-03-25T15:45:44.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62949.log.gz
2026-03-25T15:45:44.283 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62992.log
2026-03-25T15:45:44.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62970.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62970.log.gz
2026-03-25T15:45:44.284 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63013.log
2026-03-25T15:45:44.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.62992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62992.log.gz
2026-03-25T15:45:44.284 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63035.log
2026-03-25T15:45:44.285 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63013.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63013.log.gz
2026-03-25T15:45:44.285 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63056.log
2026-03-25T15:45:44.285 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63035.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63035.log.gz
2026-03-25T15:45:44.285 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63078.log
2026-03-25T15:45:44.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63056.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63056.log.gz
2026-03-25T15:45:44.286 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63099.log
2026-03-25T15:45:44.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63078.log.gz
2026-03-25T15:45:44.286 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63121.log
2026-03-25T15:45:44.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63099.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63099.log.gz
2026-03-25T15:45:44.287 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63142.log
2026-03-25T15:45:44.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63121.log.gz
2026-03-25T15:45:44.287 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63164.log
2026-03-25T15:45:44.288 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63142.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63142.log.gz
2026-03-25T15:45:44.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63185.log
2026-03-25T15:45:44.288 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63164.log.gz
2026-03-25T15:45:44.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63207.log
2026-03-25T15:45:44.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63185.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63185.log.gz
2026-03-25T15:45:44.289 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63228.log
2026-03-25T15:45:44.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63207.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63207.log.gz
2026-03-25T15:45:44.289 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63249.log
2026-03-25T15:45:44.290 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63228.log.gz
2026-03-25T15:45:44.290 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63266.log
2026-03-25T15:45:44.290 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63249.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.63249.log.gz
2026-03-25T15:45:44.290 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63283.log
2026-03-25T15:45:44.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63266.log.gz
2026-03-25T15:45:44.291 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63303.log
2026-03-25T15:45:44.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63283.log.gz
2026-03-25T15:45:44.291 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63324.log
2026-03-25T15:45:44.292 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63303.log: 4.3% -- replaced with /var/log/ceph/ceph-client.admin.63303.log.gz
2026-03-25T15:45:44.292 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63344.log
2026-03-25T15:45:44.292 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63324.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63324.log.gz
2026-03-25T15:45:44.292 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63365.log
2026-03-25T15:45:44.293 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63344.log: 5.3% -- replaced with /var/log/ceph/ceph-client.admin.63344.log.gz
2026-03-25T15:45:44.293 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63391.log
2026-03-25T15:45:44.293 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63365.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63365.log.gz
2026-03-25T15:45:44.293 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63408.log
2026-03-25T15:45:44.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63391.log.gz
2026-03-25T15:45:44.294 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63425.log
2026-03-25T15:45:44.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63408.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.63408.log.gz
2026-03-25T15:45:44.295 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63445.log
2026-03-25T15:45:44.295 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63425.log.gz
2026-03-25T15:45:44.295 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63463.log
2026-03-25T15:45:44.295 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63445.log.gz
2026-03-25T15:45:44.296 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63483.log
2026-03-25T15:45:44.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63463.log.gz
2026-03-25T15:45:44.296 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63503.log
2026-03-25T15:45:44.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63483.log.gz
2026-03-25T15:45:44.297 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63505.log
2026-03-25T15:45:44.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63503.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63503.log.gz
2026-03-25T15:45:44.297 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63525.log
2026-03-25T15:45:44.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63505.log.gz
2026-03-25T15:45:44.298 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63543.log
2026-03-25T15:45:44.298 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63525.log.gz
2026-03-25T15:45:44.298 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63568.log
2026-03-25T15:45:44.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63543.log.gz
2026-03-25T15:45:44.299 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63589.log
2026-03-25T15:45:44.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63568.log.gz
2026-03-25T15:45:44.300 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63611.log
2026-03-25T15:45:44.300 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63589.log.gz
2026-03-25T15:45:44.300 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63632.log
2026-03-25T15:45:44.301 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63611.log.gz
2026-03-25T15:45:44.301 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63654.log
2026-03-25T15:45:44.301 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63632.log.gz
2026-03-25T15:45:44.301 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63675.log
2026-03-25T15:45:44.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63654.log.gz
2026-03-25T15:45:44.302 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63697.log
2026-03-25T15:45:44.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63675.log.gz
2026-03-25T15:45:44.303 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63718.log
2026-03-25T15:45:44.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63697.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63697.log.gz
2026-03-25T15:45:44.303 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63740.log
2026-03-25T15:45:44.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63718.log.gz
2026-03-25T15:45:44.304 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63761.log
2026-03-25T15:45:44.304 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63740.log.gz
2026-03-25T15:45:44.304 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63783.log
2026-03-25T15:45:44.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63761.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63761.log.gz
2026-03-25T15:45:44.305 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63804.log
2026-03-25T15:45:44.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63783.log.gz
2026-03-25T15:45:44.305 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63826.log
2026-03-25T15:45:44.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63804.log.gz
2026-03-25T15:45:44.306 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63847.log
2026-03-25T15:45:44.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63826.log.gz
2026-03-25T15:45:44.307 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63869.log
2026-03-25T15:45:44.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63847.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63847.log.gz
2026-03-25T15:45:44.307 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63890.log
2026-03-25T15:45:44.308 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63869.log.gz
2026-03-25T15:45:44.308 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63912.log
2026-03-25T15:45:44.308 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63890.log.gz
2026-03-25T15:45:44.308 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63933.log
2026-03-25T15:45:44.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63912.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63912.log.gz
2026-03-25T15:45:44.309 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63955.log
2026-03-25T15:45:44.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63933.log.gz
2026-03-25T15:45:44.310 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63976.log
2026-03-25T15:45:44.310 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63955.log.gz
2026-03-25T15:45:44.310 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63998.log
2026-03-25T15:45:44.310 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63976.log.gz
2026-03-25T15:45:44.311 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64019.log
2026-03-25T15:45:44.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.63998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63998.log.gz
2026-03-25T15:45:44.311 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64040.log
2026-03-25T15:45:44.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64019.log.gz
2026-03-25T15:45:44.312 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64061.log
2026-03-25T15:45:44.312 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64040.log.gz
2026-03-25T15:45:44.312 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64083.log
2026-03-25T15:45:44.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64061.log.gz
2026-03-25T15:45:44.313 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64104.log
2026-03-25T15:45:44.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64083.log.gz
2026-03-25T15:45:44.313 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64126.log
2026-03-25T15:45:44.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64104.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64104.log.gz
2026-03-25T15:45:44.314 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64147.log
2026-03-25T15:45:44.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64126.log.gz
2026-03-25T15:45:44.315 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64169.log
2026-03-25T15:45:44.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64147.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64147.log.gz
2026-03-25T15:45:44.315 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64190.log
2026-03-25T15:45:44.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64169.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64169.log.gz
2026-03-25T15:45:44.316 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64212.log
2026-03-25T15:45:44.316 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64190.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64190.log.gz
2026-03-25T15:45:44.316 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64233.log
2026-03-25T15:45:44.316 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64212.log.gz
2026-03-25T15:45:44.317 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64255.log
2026-03-25T15:45:44.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64233.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64233.log.gz
2026-03-25T15:45:44.317 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64276.log
2026-03-25T15:45:44.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64255.log.gz
2026-03-25T15:45:44.318 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64298.log
2026-03-25T15:45:44.318 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64276.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64276.log.gz
2026-03-25T15:45:44.318 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64319.log
2026-03-25T15:45:44.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64298.log.gz
2026-03-25T15:45:44.319 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64341.log
2026-03-25T15:45:44.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64319.log.gz
2026-03-25T15:45:44.319 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64362.log
2026-03-25T15:45:44.320 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64341.log.gz
2026-03-25T15:45:44.320 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64384.log
2026-03-25T15:45:44.320 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64362.log.gz
2026-03-25T15:45:44.320 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64405.log
2026-03-25T15:45:44.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64384.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64384.log.gz
2026-03-25T15:45:44.321 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64427.log
2026-03-25T15:45:44.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64405.log.gz
2026-03-25T15:45:44.321 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64448.log
2026-03-25T15:45:44.322 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64427.log.gz
2026-03-25T15:45:44.322 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64470.log
2026-03-25T15:45:44.322 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64448.log.gz
2026-03-25T15:45:44.323 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64491.log
2026-03-25T15:45:44.323 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64470.log.gz
2026-03-25T15:45:44.323 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64513.log
2026-03-25T15:45:44.323 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64491.log.gz
2026-03-25T15:45:44.324 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64534.log
2026-03-25T15:45:44.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64513.log.gz
2026-03-25T15:45:44.324 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64556.log
2026-03-25T15:45:44.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64534.log.gz
2026-03-25T15:45:44.325 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64577.log
2026-03-25T15:45:44.325 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64556.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64556.log.gz
2026-03-25T15:45:44.325 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64599.log
2026-03-25T15:45:44.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64577.log.gz
2026-03-25T15:45:44.326 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64620.log
2026-03-25T15:45:44.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64599.log.gz
2026-03-25T15:45:44.326 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64642.log
2026-03-25T15:45:44.327 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64620.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64620.log.gz
2026-03-25T15:45:44.327 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64663.log
2026-03-25T15:45:44.327 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64642.log.gz
2026-03-25T15:45:44.327 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64685.log
2026-03-25T15:45:44.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64663.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64663.log.gz
2026-03-25T15:45:44.328 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64706.log
2026-03-25T15:45:44.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64685.log.gz
2026-03-25T15:45:44.328 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64728.log
2026-03-25T15:45:44.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64706.log.gz
2026-03-25T15:45:44.329 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64749.log
2026-03-25T15:45:44.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64728.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64728.log.gz
2026-03-25T15:45:44.330 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64771.log
2026-03-25T15:45:44.330 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64749.log.gz
2026-03-25T15:45:44.330 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64792.log
2026-03-25T15:45:44.330 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64771.log.gz
2026-03-25T15:45:44.331 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64814.log
2026-03-25T15:45:44.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64792.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64792.log.gz
2026-03-25T15:45:44.331 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64835.log
2026-03-25T15:45:44.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64814.log.gz
2026-03-25T15:45:44.332 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64857.log
2026-03-25T15:45:44.332 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64835.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64835.log.gz
2026-03-25T15:45:44.332 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64878.log
2026-03-25T15:45:44.332 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64857.log.gz
2026-03-25T15:45:44.333 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64900.log
2026-03-25T15:45:44.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64878.log.gz
2026-03-25T15:45:44.333 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64921.log
2026-03-25T15:45:44.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64900.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64900.log.gz
2026-03-25T15:45:44.334 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64943.log
2026-03-25T15:45:44.334 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64921.log.gz
2026-03-25T15:45:44.334 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64964.log
2026-03-25T15:45:44.334 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64943.log.gz
2026-03-25T15:45:44.335 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64986.log
2026-03-25T15:45:44.335 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64964.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64964.log.gz
2026-03-25T15:45:44.335 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65007.log
2026-03-25T15:45:44.336 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.64986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64986.log.gz
2026-03-25T15:45:44.336 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65029.log
2026-03-25T15:45:44.336 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65007.log.gz
2026-03-25T15:45:44.336 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65050.log
2026-03-25T15:45:44.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65029.log.gz
2026-03-25T15:45:44.337 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65072.log
2026-03-25T15:45:44.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65050.log: 0.0% -- replaced
with /var/log/ceph/ceph-client.admin.65050.log.gz 2026-03-25T15:45:44.337 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65093.log 2026-03-25T15:45:44.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65072.log.gz 2026-03-25T15:45:44.338 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65114.log 2026-03-25T15:45:44.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65093.log.gz 2026-03-25T15:45:44.338 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65131.log 2026-03-25T15:45:44.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65114.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.65114.log.gz 2026-03-25T15:45:44.339 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65148.log 2026-03-25T15:45:44.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65131.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.65131.log.gz 2026-03-25T15:45:44.340 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65166.log 2026-03-25T15:45:44.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65148.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65148.log.gz 2026-03-25T15:45:44.340 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65184.log 2026-03-25T15:45:44.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65166.log.gz 2026-03-25T15:45:44.341 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65203.log 2026-03-25T15:45:44.341 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65184.log.gz 2026-03-25T15:45:44.341 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65224.log 2026-03-25T15:45:44.342 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65203.log.gz 2026-03-25T15:45:44.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65245.log 2026-03-25T15:45:44.342 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65224.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65224.log.gz 2026-03-25T15:45:44.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65265.log 2026-03-25T15:45:44.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65245.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65245.log.gz 2026-03-25T15:45:44.343 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65285.log 2026-03-25T15:45:44.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65265.log.gz 2026-03-25T15:45:44.344 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65302.log 2026-03-25T15:45:44.344 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65285.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65285.log.gz 2026-03-25T15:45:44.344 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65319.log 2026-03-25T15:45:44.344 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65302.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.65302.log.gz 2026-03-25T15:45:44.345 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65337.log 2026-03-25T15:45:44.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65319.log.gz 2026-03-25T15:45:44.345 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65355.log 2026-03-25T15:45:44.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65337.log.gz 2026-03-25T15:45:44.346 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65374.log 2026-03-25T15:45:44.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65355.log.gz 2026-03-25T15:45:44.346 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65395.log 2026-03-25T15:45:44.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65374.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65374.log.gz 2026-03-25T15:45:44.347 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65416.log 2026-03-25T15:45:44.347 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65395.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65395.log.gz 2026-03-25T15:45:44.347 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65437.log 2026-03-25T15:45:44.348 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65416.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.65416.log.gz 2026-03-25T15:45:44.348 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65458.log 2026-03-25T15:45:44.348 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65437.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.65437.log.gz 2026-03-25T15:45:44.348 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65475.log 2026-03-25T15:45:44.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65458.log.gz 2026-03-25T15:45:44.349 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65492.log 2026-03-25T15:45:44.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65475.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.65475.log.gz 2026-03-25T15:45:44.349 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65510.log 2026-03-25T15:45:44.350 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65492.log.gz 2026-03-25T15:45:44.350 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65528.log 2026-03-25T15:45:44.350 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65510.log.gz 2026-03-25T15:45:44.351 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65547.log 2026-03-25T15:45:44.351 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65528.log.gz 2026-03-25T15:45:44.351 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65568.log 2026-03-25T15:45:44.351 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65547.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.65547.log.gz 2026-03-25T15:45:44.352 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65590.log 2026-03-25T15:45:44.352 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65568.log.gz 2026-03-25T15:45:44.352 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65611.log 2026-03-25T15:45:44.352 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65590.log.gz 2026-03-25T15:45:44.353 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65633.log 2026-03-25T15:45:44.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65611.log.gz 2026-03-25T15:45:44.353 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65654.log 2026-03-25T15:45:44.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65633.log.gz 2026-03-25T15:45:44.354 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65676.log 2026-03-25T15:45:44.354 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65654.log.gz 2026-03-25T15:45:44.354 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65697.log 2026-03-25T15:45:44.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65676.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65676.log.gz 2026-03-25T15:45:44.355 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65719.log 2026-03-25T15:45:44.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65697.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65697.log.gz 2026-03-25T15:45:44.355 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65740.log 2026-03-25T15:45:44.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65719.log.gz 2026-03-25T15:45:44.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65762.log 2026-03-25T15:45:44.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65740.log.gz 2026-03-25T15:45:44.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65783.log 2026-03-25T15:45:44.357 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65762.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65762.log.gz 2026-03-25T15:45:44.357 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65805.log 2026-03-25T15:45:44.357 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65783.log.gz 2026-03-25T15:45:44.357 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65826.log 2026-03-25T15:45:44.358 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65805.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65805.log.gz 2026-03-25T15:45:44.358 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65848.log 2026-03-25T15:45:44.358 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65826.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.65826.log.gz 2026-03-25T15:45:44.359 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65869.log 2026-03-25T15:45:44.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65848.log.gz 2026-03-25T15:45:44.359 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65891.log 2026-03-25T15:45:44.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65869.log.gz 2026-03-25T15:45:44.360 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65912.log 2026-03-25T15:45:44.360 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65891.log.gz 2026-03-25T15:45:44.360 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65934.log 2026-03-25T15:45:44.360 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65912.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65912.log.gz 2026-03-25T15:45:44.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65955.log 2026-03-25T15:45:44.361 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65934.log.gz 2026-03-25T15:45:44.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65977.log 2026-03-25T15:45:44.362 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65955.log.gz 2026-03-25T15:45:44.362 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65998.log 2026-03-25T15:45:44.362 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65977.log.gz 2026-03-25T15:45:44.362 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66020.log 2026-03-25T15:45:44.363 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.65998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65998.log.gz 2026-03-25T15:45:44.363 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66041.log 2026-03-25T15:45:44.363 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66020.log.gz 2026-03-25T15:45:44.364 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66063.log 2026-03-25T15:45:44.364 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66041.log.gz 2026-03-25T15:45:44.364 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66084.log 2026-03-25T15:45:44.364 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66063.log.gz 2026-03-25T15:45:44.365 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66106.log 2026-03-25T15:45:44.365 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66084.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66084.log.gz 2026-03-25T15:45:44.365 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66127.log 2026-03-25T15:45:44.365 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66106.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.66106.log.gz 2026-03-25T15:45:44.366 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66149.log 2026-03-25T15:45:44.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66127.log.gz 2026-03-25T15:45:44.366 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66170.log 2026-03-25T15:45:44.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66149.log.gz 2026-03-25T15:45:44.367 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66192.log 2026-03-25T15:45:44.367 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66170.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.66170.log.gz 2026-03-25T15:45:44.367 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66213.log 2026-03-25T15:45:44.368 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66192.log.gz 2026-03-25T15:45:44.368 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66234.log 2026-03-25T15:45:44.368 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66213.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66213.log.gz 2026-03-25T15:45:44.368 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66255.log 2026-03-25T15:45:44.369 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66234.log.gz 2026-03-25T15:45:44.369 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66277.log 2026-03-25T15:45:44.369 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66255.log.gz 2026-03-25T15:45:44.370 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66298.log 2026-03-25T15:45:44.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66277.log.gz 2026-03-25T15:45:44.370 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66320.log 2026-03-25T15:45:44.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66298.log.gz 2026-03-25T15:45:44.371 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66341.log 2026-03-25T15:45:44.371 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66320.log.gz 2026-03-25T15:45:44.371 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66363.log 2026-03-25T15:45:44.371 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66341.log: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.66341.log.gz 2026-03-25T15:45:44.372 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66380.log 2026-03-25T15:45:44.372 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66363.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66363.log.gz 2026-03-25T15:45:44.372 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66397.log 2026-03-25T15:45:44.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66380.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.66380.log.gz 2026-03-25T15:45:44.373 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66414.log 2026-03-25T15:45:44.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66397.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66431.log 2026-03-25T15:45:44.373 INFO:teuthology.orchestra.run.vm04.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66397.log.gz 2026-03-25T15:45:44.374 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66414.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66414.log.gz 2026-03-25T15:45:44.374 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66448.log 2026-03-25T15:45:44.374 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66431.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66431.log.gz 2026-03-25T15:45:44.375 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66465.log 2026-03-25T15:45:44.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66448.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66448.log.gz 2026-03-25T15:45:44.375 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66482.log 2026-03-25T15:45:44.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66465.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66465.log.gz 2026-03-25T15:45:44.376 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66499.log 2026-03-25T15:45:44.376 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66482.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66482.log.gz 2026-03-25T15:45:44.376 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66516.log 2026-03-25T15:45:44.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66499.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66499.log.gz 2026-03-25T15:45:44.377 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66533.log 2026-03-25T15:45:44.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66516.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66516.log.gz 2026-03-25T15:45:44.377 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66550.log 2026-03-25T15:45:44.378 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66533.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66533.log.gz 2026-03-25T15:45:44.378 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66567.log 2026-03-25T15:45:44.378 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66550.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66550.log.gz 2026-03-25T15:45:44.378 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66584.log 2026-03-25T15:45:44.379 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66567.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66567.log.gz 2026-03-25T15:45:44.379 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66601.log 2026-03-25T15:45:44.379 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66584.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66584.log.gz 2026-03-25T15:45:44.380 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66618.log 2026-03-25T15:45:44.380 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66601.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.66601.log.gz 2026-03-25T15:45:44.380 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66635.log 2026-03-25T15:45:44.381 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66618.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66618.log.gz 2026-03-25T15:45:44.381 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66652.log 2026-03-25T15:45:44.381 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66635.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66635.log.gz 2026-03-25T15:45:44.381 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66669.log 2026-03-25T15:45:44.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66652.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66686.log 2026-03-25T15:45:44.382 INFO:teuthology.orchestra.run.vm04.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.66652.log.gz 2026-03-25T15:45:44.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66669.log: 3.8% -- replaced with /var/log/ceph/ceph-client.admin.66669.log.gz 2026-03-25T15:45:44.383 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66703.log 2026-03-25T15:45:44.383 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66686.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66686.log.gz 2026-03-25T15:45:44.383 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66720.log 2026-03-25T15:45:44.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66703.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66703.log.gz 2026-03-25T15:45:44.384 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66737.log 2026-03-25T15:45:44.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66720.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66720.log.gz 2026-03-25T15:45:44.384 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66754.log 2026-03-25T15:45:44.385 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66737.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66737.log.gz 2026-03-25T15:45:44.385 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66771.log 2026-03-25T15:45:44.385 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66754.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66754.log.gz 2026-03-25T15:45:44.385 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66788.log 2026-03-25T15:45:44.386 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66771.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66771.log.gz 2026-03-25T15:45:44.386 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66805.log 2026-03-25T15:45:44.386 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66788.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66788.log.gz 2026-03-25T15:45:44.387 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66822.log 2026-03-25T15:45:44.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66805.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66805.log.gz 2026-03-25T15:45:44.387 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66839.log 2026-03-25T15:45:44.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66822.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.66822.log.gz 2026-03-25T15:45:44.388 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66856.log 2026-03-25T15:45:44.388 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66839.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66839.log.gz 2026-03-25T15:45:44.388 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66873.log 2026-03-25T15:45:44.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66856.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66856.log.gz 2026-03-25T15:45:44.389 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66890.log 2026-03-25T15:45:44.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66873.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66873.log.gz 2026-03-25T15:45:44.389 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66907.log 2026-03-25T15:45:44.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66890.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66890.log.gz 2026-03-25T15:45:44.390 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66924.log 2026-03-25T15:45:44.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66907.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66907.log.gz 2026-03-25T15:45:44.391 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66941.log 2026-03-25T15:45:44.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66924.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66924.log.gz 2026-03-25T15:45:44.391 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66958.log 2026-03-25T15:45:44.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66941.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66941.log.gz 2026-03-25T15:45:44.392 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66975.log 2026-03-25T15:45:44.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66958.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66958.log.gz 2026-03-25T15:45:44.392 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66992.log 2026-03-25T15:45:44.393 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66975.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66975.log.gz 2026-03-25T15:45:44.393 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67009.log 2026-03-25T15:45:44.393 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.66992.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.66992.log.gz 2026-03-25T15:45:44.394 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67026.log 2026-03-25T15:45:44.394 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67009.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67009.log.gz 2026-03-25T15:45:44.394 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67043.log 2026-03-25T15:45:44.394 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67026.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67026.log.gz 2026-03-25T15:45:44.395 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67060.log 2026-03-25T15:45:44.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67043.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.67043.log.gz 2026-03-25T15:45:44.395 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67077.log 2026-03-25T15:45:44.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67060.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67060.log.gz 2026-03-25T15:45:44.396 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67094.log 2026-03-25T15:45:44.396 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67077.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67077.log.gz 2026-03-25T15:45:44.396 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67111.log 2026-03-25T15:45:44.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67094.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67094.log.gz 2026-03-25T15:45:44.397 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67128.log 2026-03-25T15:45:44.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67111.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67111.log.gz 2026-03-25T15:45:44.397 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67145.log 2026-03-25T15:45:44.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67128.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67128.log.gz 2026-03-25T15:45:44.398 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67162.log 2026-03-25T15:45:44.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67145.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67145.log.gz 2026-03-25T15:45:44.398 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67179.log 2026-03-25T15:45:44.399 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67162.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67162.log.gz 2026-03-25T15:45:44.399 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67196.log 2026-03-25T15:45:44.399 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67179.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67179.log.gz 2026-03-25T15:45:44.399 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67213.log 2026-03-25T15:45:44.400 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67196.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67196.log.gz 2026-03-25T15:45:44.400 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67230.log 2026-03-25T15:45:44.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67213.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67213.log.gz 2026-03-25T15:45:44.401 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67247.log 2026-03-25T15:45:44.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67230.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67230.log.gz 2026-03-25T15:45:44.401 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67264.log 2026-03-25T15:45:44.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67247.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67247.log.gz 2026-03-25T15:45:44.402 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67281.log 2026-03-25T15:45:44.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67264.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.67264.log.gz 2026-03-25T15:45:44.402 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67298.log 2026-03-25T15:45:44.403 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67281.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67281.log.gz 2026-03-25T15:45:44.403 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67315.log 2026-03-25T15:45:44.403 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67298.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67298.log.gz 2026-03-25T15:45:44.403 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67332.log 2026-03-25T15:45:44.404 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67315.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67315.log.gz 2026-03-25T15:45:44.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67349.log 2026-03-25T15:45:44.404 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67332.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67332.log.gz 2026-03-25T15:45:44.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67366.log 2026-03-25T15:45:44.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67349.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67349.log.gz 2026-03-25T15:45:44.405 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67383.log 2026-03-25T15:45:44.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67366.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67366.log.gz 2026-03-25T15:45:44.405 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67400.log 2026-03-25T15:45:44.406 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67383.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67383.log.gz 2026-03-25T15:45:44.406 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67417.log 2026-03-25T15:45:44.406 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67400.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67400.log.gz 2026-03-25T15:45:44.406 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67434.log 2026-03-25T15:45:44.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67417.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67417.log.gz 2026-03-25T15:45:44.407 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67451.log 2026-03-25T15:45:44.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67434.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67434.log.gz 2026-03-25T15:45:44.407 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67468.log 2026-03-25T15:45:44.408 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67451.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67451.log.gz 2026-03-25T15:45:44.408 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67485.log 2026-03-25T15:45:44.408 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67468.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67468.log.gz 2026-03-25T15:45:44.408 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67502.log 2026-03-25T15:45:44.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67485.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.67485.log.gz 2026-03-25T15:45:44.409 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67519.log 2026-03-25T15:45:44.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67502.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67502.log.gz 2026-03-25T15:45:44.409 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67536.log 2026-03-25T15:45:44.410 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67519.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67519.log.gz 2026-03-25T15:45:44.410 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67553.log 2026-03-25T15:45:44.410 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67536.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67536.log.gz 2026-03-25T15:45:44.411 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67570.log 2026-03-25T15:45:44.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67553.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67553.log.gz 2026-03-25T15:45:44.411 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67587.log 2026-03-25T15:45:44.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67570.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67570.log.gz 2026-03-25T15:45:44.412 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67604.log 2026-03-25T15:45:44.412 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67587.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67587.log.gz 2026-03-25T15:45:44.412 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67621.log 2026-03-25T15:45:44.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67604.log: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.67604.log.gz 2026-03-25T15:45:44.413 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67638.log 2026-03-25T15:45:44.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67621.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67621.log.gz 2026-03-25T15:45:44.413 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67655.log 2026-03-25T15:45:44.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67638.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67638.log.gz 2026-03-25T15:45:44.414 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67672.log 2026-03-25T15:45:44.414 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67655.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67655.log.gz 2026-03-25T15:45:44.414 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67689.log 2026-03-25T15:45:44.415 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67672.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67672.log.gz 2026-03-25T15:45:44.415 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67706.log 2026-03-25T15:45:44.415 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67689.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67689.log.gz 2026-03-25T15:45:44.415 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67723.log 2026-03-25T15:45:44.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67706.log: 1.2% -- replaced 
with /var/log/ceph/ceph-client.admin.67706.log.gz 2026-03-25T15:45:44.416 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67740.log 2026-03-25T15:45:44.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67723.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67723.log.gz 2026-03-25T15:45:44.416 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67757.log 2026-03-25T15:45:44.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67740.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67740.log.gz 2026-03-25T15:45:44.417 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67774.log 2026-03-25T15:45:44.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67757.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67757.log.gz 2026-03-25T15:45:44.418 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67791.log 2026-03-25T15:45:44.418 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67774.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67774.log.gz 2026-03-25T15:45:44.418 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67808.log 2026-03-25T15:45:44.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67791.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67791.log.gz 2026-03-25T15:45:44.419 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67825.log 2026-03-25T15:45:44.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67808.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67808.log.gz 2026-03-25T15:45:44.419 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67842.log 2026-03-25T15:45:44.420 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67825.log: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.67825.log.gz 2026-03-25T15:45:44.420 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67859.log 2026-03-25T15:45:44.420 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67842.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67842.log.gz 2026-03-25T15:45:44.420 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67876.log 2026-03-25T15:45:44.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67859.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67859.log.gz 2026-03-25T15:45:44.421 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67893.log 2026-03-25T15:45:44.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67876.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67876.log.gz 2026-03-25T15:45:44.421 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67910.log 2026-03-25T15:45:44.422 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67893.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67893.log.gz 2026-03-25T15:45:44.422 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67927.log 2026-03-25T15:45:44.422 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67910.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67910.log.gz 2026-03-25T15:45:44.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67944.log 2026-03-25T15:45:44.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67927.log: 33.0% -- replaced 
with /var/log/ceph/ceph-client.admin.67927.log.gz 2026-03-25T15:45:44.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67961.log 2026-03-25T15:45:44.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67944.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67944.log.gz 2026-03-25T15:45:44.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67978.log 2026-03-25T15:45:44.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67961.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67961.log.gz 2026-03-25T15:45:44.424 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67995.log 2026-03-25T15:45:44.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67978.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67978.log.gz 2026-03-25T15:45:44.424 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68012.log 2026-03-25T15:45:44.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.67995.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.67995.log.gz 2026-03-25T15:45:44.425 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68029.log 2026-03-25T15:45:44.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68012.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.68012.log.gz 2026-03-25T15:45:44.426 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68046.log 2026-03-25T15:45:44.426 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68029.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.68029.log.gz 2026-03-25T15:45:44.426 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.68063.log 2026-03-25T15:45:44.426 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68046.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.68046.log.gz 2026-03-25T15:45:44.427 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68082.log 2026-03-25T15:45:44.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68063.log.gz 2026-03-25T15:45:44.427 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68106.log 2026-03-25T15:45:44.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68082.log: 56.2% -- replaced with /var/log/ceph/ceph-client.admin.68082.log.gz 2026-03-25T15:45:44.427 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68126.log 2026-03-25T15:45:44.428 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68106.log.gz 2026-03-25T15:45:44.428 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68146.log 2026-03-25T15:45:44.428 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68126.log.gz 2026-03-25T15:45:44.428 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68166.log 2026-03-25T15:45:44.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68146.log.gz 2026-03-25T15:45:44.429 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68186.log 2026-03-25T15:45:44.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68166.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.68166.log.gz 2026-03-25T15:45:44.429 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68206.log 2026-03-25T15:45:44.430 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68186.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68186.log.gz 2026-03-25T15:45:44.430 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68226.log 2026-03-25T15:45:44.430 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68206.log.gz 2026-03-25T15:45:44.430 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68246.log 2026-03-25T15:45:44.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68226.log.gz 2026-03-25T15:45:44.431 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68266.log 2026-03-25T15:45:44.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68246.log.gz 2026-03-25T15:45:44.431 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68286.log 2026-03-25T15:45:44.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68266.log.gz 2026-03-25T15:45:44.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68306.log 2026-03-25T15:45:44.432 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68286.log.gz 2026-03-25T15:45:44.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.68326.log 2026-03-25T15:45:44.432 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68306.log.gz 2026-03-25T15:45:44.433 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68346.log 2026-03-25T15:45:44.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68326.log.gz 2026-03-25T15:45:44.433 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68366.log 2026-03-25T15:45:44.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68346.log.gz 2026-03-25T15:45:44.433 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68386.log 2026-03-25T15:45:44.434 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68366.log.gz 2026-03-25T15:45:44.434 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68406.log 2026-03-25T15:45:44.434 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68386.log.gz 2026-03-25T15:45:44.434 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68426.log 2026-03-25T15:45:44.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68406.log.gz 2026-03-25T15:45:44.435 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68446.log 2026-03-25T15:45:44.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68426.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.68426.log.gz 2026-03-25T15:45:44.436 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68466.log 2026-03-25T15:45:44.436 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68446.log.gz 2026-03-25T15:45:44.436 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68486.log 2026-03-25T15:45:44.436 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68466.log.gz 2026-03-25T15:45:44.436 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68506.log 2026-03-25T15:45:44.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68486.log.gz 2026-03-25T15:45:44.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68526.log 2026-03-25T15:45:44.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68506.log.gz 2026-03-25T15:45:44.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68546.log 2026-03-25T15:45:44.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68526.log.gz 2026-03-25T15:45:44.438 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68566.log 2026-03-25T15:45:44.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68546.log.gz 2026-03-25T15:45:44.438 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.68586.log 2026-03-25T15:45:44.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68566.log.gz 2026-03-25T15:45:44.439 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68606.log 2026-03-25T15:45:44.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68586.log.gz 2026-03-25T15:45:44.439 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68626.log 2026-03-25T15:45:44.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68606.log.gz 2026-03-25T15:45:44.440 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68646.log 2026-03-25T15:45:44.440 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68626.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68626.log.gz 2026-03-25T15:45:44.440 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68666.log 2026-03-25T15:45:44.440 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68646.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68646.log.gz 2026-03-25T15:45:44.441 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68686.log 2026-03-25T15:45:44.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68666.log.gz 2026-03-25T15:45:44.441 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68706.log 2026-03-25T15:45:44.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68686.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.68686.log.gz 2026-03-25T15:45:44.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68726.log 2026-03-25T15:45:44.442 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68706.log.gz 2026-03-25T15:45:44.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68746.log 2026-03-25T15:45:44.442 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68726.log.gz 2026-03-25T15:45:44.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68766.log 2026-03-25T15:45:44.443 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68746.log.gz 2026-03-25T15:45:44.443 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68786.log 2026-03-25T15:45:44.443 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68766.log.gz 2026-03-25T15:45:44.443 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68806.log 2026-03-25T15:45:44.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68786.log.gz 2026-03-25T15:45:44.444 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68826.log 2026-03-25T15:45:44.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68806.log.gz 2026-03-25T15:45:44.444 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.68846.log 2026-03-25T15:45:44.445 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68826.log.gz 2026-03-25T15:45:44.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68866.log 2026-03-25T15:45:44.445 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68846.log.gz 2026-03-25T15:45:44.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68886.log 2026-03-25T15:45:44.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68866.log.gz 2026-03-25T15:45:44.446 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68906.log 2026-03-25T15:45:44.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68886.log.gz 2026-03-25T15:45:44.446 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68926.log 2026-03-25T15:45:44.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68906.log.gz 2026-03-25T15:45:44.447 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68946.log 2026-03-25T15:45:44.447 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68926.log.gz 2026-03-25T15:45:44.447 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68966.log 2026-03-25T15:45:44.447 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68946.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.68946.log.gz 2026-03-25T15:45:44.448 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68986.log 2026-03-25T15:45:44.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68966.log.gz 2026-03-25T15:45:44.448 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69006.log 2026-03-25T15:45:44.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.68986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68986.log.gz 2026-03-25T15:45:44.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69026.log 2026-03-25T15:45:44.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69006.log.gz 2026-03-25T15:45:44.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69046.log 2026-03-25T15:45:44.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69026.log.gz 2026-03-25T15:45:44.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69066.log 2026-03-25T15:45:44.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69046.log.gz 2026-03-25T15:45:44.450 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69086.log 2026-03-25T15:45:44.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69066.log.gz 2026-03-25T15:45:44.450 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69106.log 2026-03-25T15:45:44.451 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69086.log.gz 2026-03-25T15:45:44.451 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69126.log 2026-03-25T15:45:44.451 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69106.log.gz 2026-03-25T15:45:44.452 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69146.log 2026-03-25T15:45:44.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69126.log.gz 2026-03-25T15:45:44.452 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69166.log 2026-03-25T15:45:44.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69146.log.gz 2026-03-25T15:45:44.452 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69186.log 2026-03-25T15:45:44.453 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69166.log.gz 2026-03-25T15:45:44.453 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69206.log 2026-03-25T15:45:44.453 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69186.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69186.log.gz 2026-03-25T15:45:44.453 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69226.log 2026-03-25T15:45:44.454 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69206.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.69206.log.gz 2026-03-25T15:45:44.454 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69246.log 2026-03-25T15:45:44.454 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69226.log.gz 2026-03-25T15:45:44.454 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69266.log 2026-03-25T15:45:44.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69246.log.gz 2026-03-25T15:45:44.455 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69286.log 2026-03-25T15:45:44.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69266.log.gz 2026-03-25T15:45:44.455 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69306.log 2026-03-25T15:45:44.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69286.log.gz 2026-03-25T15:45:44.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69326.log 2026-03-25T15:45:44.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69306.log.gz 2026-03-25T15:45:44.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69346.log 2026-03-25T15:45:44.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69326.log.gz 2026-03-25T15:45:44.457 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69366.log 2026-03-25T15:45:44.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69346.log.gz 2026-03-25T15:45:44.457 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69386.log 2026-03-25T15:45:44.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69366.log.gz 2026-03-25T15:45:44.458 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69406.log 2026-03-25T15:45:44.458 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69386.log.gz 2026-03-25T15:45:44.458 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69426.log 2026-03-25T15:45:44.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69406.log.gz 2026-03-25T15:45:44.459 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69446.log 2026-03-25T15:45:44.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69426.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69426.log.gz 2026-03-25T15:45:44.460 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69466.log 2026-03-25T15:45:44.460 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69446.log.gz 2026-03-25T15:45:44.460 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69486.log 2026-03-25T15:45:44.460 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69466.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.69466.log.gz 2026-03-25T15:45:44.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69506.log 2026-03-25T15:45:44.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69486.log.gz 2026-03-25T15:45:44.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69526.log 2026-03-25T15:45:44.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69506.log.gz 2026-03-25T15:45:44.462 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69546.log 2026-03-25T15:45:44.462 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69526.log.gz 2026-03-25T15:45:44.462 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69566.log 2026-03-25T15:45:44.463 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69546.log.gz 2026-03-25T15:45:44.463 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69586.log 2026-03-25T15:45:44.463 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69566.log.gz 2026-03-25T15:45:44.464 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69606.log 2026-03-25T15:45:44.464 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69586.log.gz 2026-03-25T15:45:44.464 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69626.log 2026-03-25T15:45:44.465 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69606.log.gz 2026-03-25T15:45:44.465 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69646.log 2026-03-25T15:45:44.465 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69626.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69626.log.gz 2026-03-25T15:45:44.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69666.log 2026-03-25T15:45:44.466 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69646.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69646.log.gz 2026-03-25T15:45:44.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69686.log 2026-03-25T15:45:44.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69666.log.gz 2026-03-25T15:45:44.467 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69706.log 2026-03-25T15:45:44.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69686.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69686.log.gz 2026-03-25T15:45:44.467 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69726.log 2026-03-25T15:45:44.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69706.log.gz 2026-03-25T15:45:44.468 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69746.log 2026-03-25T15:45:44.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69726.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.69726.log.gz 2026-03-25T15:45:44.469 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69766.log 2026-03-25T15:45:44.469 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69746.log.gz 2026-03-25T15:45:44.469 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69786.log 2026-03-25T15:45:44.470 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69766.log.gz 2026-03-25T15:45:44.470 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69806.log 2026-03-25T15:45:44.470 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69786.log.gz 2026-03-25T15:45:44.471 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69826.log 2026-03-25T15:45:44.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69806.log.gz 2026-03-25T15:45:44.471 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69846.log 2026-03-25T15:45:44.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69826.log.gz 2026-03-25T15:45:44.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69866.log 2026-03-25T15:45:44.472 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69846.log.gz 2026-03-25T15:45:44.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69886.log 2026-03-25T15:45:44.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69866.log.gz 2026-03-25T15:45:44.473 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69906.log 2026-03-25T15:45:44.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69886.log.gz 2026-03-25T15:45:44.473 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69926.log 2026-03-25T15:45:44.474 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69906.log.gz 2026-03-25T15:45:44.474 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69946.log 2026-03-25T15:45:44.474 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69926.log.gz 2026-03-25T15:45:44.475 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69966.log 2026-03-25T15:45:44.475 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69946.log.gz 2026-03-25T15:45:44.475 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69986.log 2026-03-25T15:45:44.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69966.log.gz 2026-03-25T15:45:44.476 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70006.log 2026-03-25T15:45:44.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.69986.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.69986.log.gz 2026-03-25T15:45:44.476 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70026.log 2026-03-25T15:45:44.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70006.log.gz 2026-03-25T15:45:44.477 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70046.log 2026-03-25T15:45:44.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70026.log.gz 2026-03-25T15:45:44.478 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70066.log 2026-03-25T15:45:44.478 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70046.log.gz 2026-03-25T15:45:44.478 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70086.log 2026-03-25T15:45:44.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70066.log.gz 2026-03-25T15:45:44.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70107.log 2026-03-25T15:45:44.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70086.log.gz 2026-03-25T15:45:44.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70124.log 2026-03-25T15:45:44.480 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70107.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70107.log.gz 2026-03-25T15:45:44.480 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.70141.log 2026-03-25T15:45:44.480 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70124.log.gz 2026-03-25T15:45:44.481 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70158.log 2026-03-25T15:45:44.481 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70141.log.gz 2026-03-25T15:45:44.481 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70175.log 2026-03-25T15:45:44.481 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70158.log.gz 2026-03-25T15:45:44.482 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70192.log 2026-03-25T15:45:44.482 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70175.log.gz 2026-03-25T15:45:44.482 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70209.log 2026-03-25T15:45:44.483 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70192.log.gz 2026-03-25T15:45:44.483 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70226.log 2026-03-25T15:45:44.483 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70209.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70209.log.gz 2026-03-25T15:45:44.484 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70243.log 2026-03-25T15:45:44.484 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70226.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.70226.log.gz 2026-03-25T15:45:44.484 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70260.log 2026-03-25T15:45:44.485 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70243.log.gz 2026-03-25T15:45:44.485 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70277.log 2026-03-25T15:45:44.485 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70260.log.gz 2026-03-25T15:45:44.485 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70294.log 2026-03-25T15:45:44.486 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70277.log.gz 2026-03-25T15:45:44.486 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70311.log 2026-03-25T15:45:44.486 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70294.log.gz 2026-03-25T15:45:44.487 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70328.log 2026-03-25T15:45:44.487 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70311.log.gz 2026-03-25T15:45:44.487 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70345.log 2026-03-25T15:45:44.487 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70328.log.gz 2026-03-25T15:45:44.488 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.70362.log 2026-03-25T15:45:44.488 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70345.log.gz 2026-03-25T15:45:44.488 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70379.log 2026-03-25T15:45:44.489 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70362.log.gz 2026-03-25T15:45:44.489 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70396.log 2026-03-25T15:45:44.489 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70379.log.gz 2026-03-25T15:45:44.489 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70413.log 2026-03-25T15:45:44.490 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70396.log.gz 2026-03-25T15:45:44.490 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70430.log 2026-03-25T15:45:44.490 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70413.log.gz 2026-03-25T15:45:44.491 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70447.log 2026-03-25T15:45:44.491 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70430.log.gz 2026-03-25T15:45:44.491 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70464.log 2026-03-25T15:45:44.491 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70447.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.70447.log.gz 2026-03-25T15:45:44.492 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70481.log 2026-03-25T15:45:44.492 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70464.log.gz 2026-03-25T15:45:44.492 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70498.log 2026-03-25T15:45:44.492 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70481.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70481.log.gz 2026-03-25T15:45:44.492 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70515.log 2026-03-25T15:45:44.493 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70498.log.gz 2026-03-25T15:45:44.493 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70532.log 2026-03-25T15:45:44.493 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70515.log.gz 2026-03-25T15:45:44.493 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70549.log 2026-03-25T15:45:44.494 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70532.log.gz 2026-03-25T15:45:44.494 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70566.log 2026-03-25T15:45:44.499 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70549.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70549.log.gz 2026-03-25T15:45:44.499 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.70583.log 2026-03-25T15:45:44.499 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70566.log.gz 2026-03-25T15:45:44.500 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70600.log 2026-03-25T15:45:44.500 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70583.log.gz 2026-03-25T15:45:44.500 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70617.log 2026-03-25T15:45:44.500 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70600.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70600.log.gz 2026-03-25T15:45:44.501 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70634.log 2026-03-25T15:45:44.501 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70617.log.gz 2026-03-25T15:45:44.501 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70651.log 2026-03-25T15:45:44.502 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70634.log.gz 2026-03-25T15:45:44.502 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70668.log 2026-03-25T15:45:44.502 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70651.log.gz 2026-03-25T15:45:44.502 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70685.log 2026-03-25T15:45:44.503 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70668.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.70668.log.gz 2026-03-25T15:45:44.503 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70702.log 2026-03-25T15:45:44.503 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70685.log.gz 2026-03-25T15:45:44.503 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70719.log 2026-03-25T15:45:44.504 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70702.log.gz 2026-03-25T15:45:44.504 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70736.log 2026-03-25T15:45:44.504 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70719.log.gz 2026-03-25T15:45:44.504 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70753.log 2026-03-25T15:45:44.505 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70736.log.gz 2026-03-25T15:45:44.505 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70770.log 2026-03-25T15:45:44.505 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70753.log.gz 2026-03-25T15:45:44.506 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70788.log 2026-03-25T15:45:44.506 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70770.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70770.log.gz 2026-03-25T15:45:44.506 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.70805.log 2026-03-25T15:45:44.506 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70788.log.gz 2026-03-25T15:45:44.507 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70822.log 2026-03-25T15:45:44.507 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70805.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70805.log.gz 2026-03-25T15:45:44.507 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70839.log 2026-03-25T15:45:44.507 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70822.log.gz 2026-03-25T15:45:44.508 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70856.log 2026-03-25T15:45:44.508 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70839.log.gz 2026-03-25T15:45:44.508 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70873.log 2026-03-25T15:45:44.509 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70856.log.gz 2026-03-25T15:45:44.509 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70890.log 2026-03-25T15:45:44.509 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70873.log.gz 2026-03-25T15:45:44.509 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70907.log 2026-03-25T15:45:44.510 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70890.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.70890.log.gz 2026-03-25T15:45:44.510 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70924.log 2026-03-25T15:45:44.510 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70907.log.gz 2026-03-25T15:45:44.511 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70941.log 2026-03-25T15:45:44.511 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70924.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70924.log.gz 2026-03-25T15:45:44.511 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70958.log 2026-03-25T15:45:44.511 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70941.log.gz 2026-03-25T15:45:44.512 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70975.log 2026-03-25T15:45:44.512 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70958.log.gz 2026-03-25T15:45:44.512 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70992.log 2026-03-25T15:45:44.512 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70975.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70975.log.gz 2026-03-25T15:45:44.513 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71009.log 2026-03-25T15:45:44.513 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.70992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70992.log.gz 2026-03-25T15:45:44.513 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71026.log 2026-03-25T15:45:44.513 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71009.log.gz 2026-03-25T15:45:44.514 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71043.log 2026-03-25T15:45:44.514 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71026.log.gz 2026-03-25T15:45:44.514 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71060.log 2026-03-25T15:45:44.515 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71043.log.gz 2026-03-25T15:45:44.515 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71077.log 2026-03-25T15:45:44.515 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71060.log.gz 2026-03-25T15:45:44.516 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71094.log 2026-03-25T15:45:44.516 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71077.log.gz 2026-03-25T15:45:44.516 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71111.log 2026-03-25T15:45:44.516 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71094.log.gz 2026-03-25T15:45:44.517 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71128.log 2026-03-25T15:45:44.517 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71111.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.71111.log.gz 2026-03-25T15:45:44.517 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71145.log 2026-03-25T15:45:44.518 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71128.log.gz 2026-03-25T15:45:44.518 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71162.log 2026-03-25T15:45:44.518 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71145.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71145.log.gz 2026-03-25T15:45:44.519 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71179.log 2026-03-25T15:45:44.519 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71162.log.gz 2026-03-25T15:45:44.519 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71196.log 2026-03-25T15:45:44.519 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71179.log.gz 2026-03-25T15:45:44.520 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71213.log 2026-03-25T15:45:44.520 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71196.log.gz 2026-03-25T15:45:44.520 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71230.log 2026-03-25T15:45:44.520 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71213.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71213.log.gz 2026-03-25T15:45:44.521 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71247.log 2026-03-25T15:45:44.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71230.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71230.log.gz 2026-03-25T15:45:44.521 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71264.log 2026-03-25T15:45:44.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71247.log.gz 2026-03-25T15:45:44.522 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71281.log 2026-03-25T15:45:44.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71264.log.gz 2026-03-25T15:45:44.522 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71298.log 2026-03-25T15:45:44.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71281.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71281.log.gz 2026-03-25T15:45:44.523 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71315.log 2026-03-25T15:45:44.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71298.log.gz 2026-03-25T15:45:44.523 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71332.log 2026-03-25T15:45:44.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71315.log.gz 2026-03-25T15:45:44.524 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71349.log 2026-03-25T15:45:44.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71332.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.71332.log.gz 2026-03-25T15:45:44.524 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71366.log 2026-03-25T15:45:44.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71349.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71349.log.gz 2026-03-25T15:45:44.525 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71383.log 2026-03-25T15:45:44.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71366.log.gz 2026-03-25T15:45:44.526 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71400.log 2026-03-25T15:45:44.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71383.log.gz 2026-03-25T15:45:44.526 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71417.log 2026-03-25T15:45:44.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71400.log.gz 2026-03-25T15:45:44.527 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71434.log 2026-03-25T15:45:44.527 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71417.log.gz 2026-03-25T15:45:44.527 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71451.log 2026-03-25T15:45:44.527 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71434.log.gz 2026-03-25T15:45:44.528 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71468.log 2026-03-25T15:45:44.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71451.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71451.log.gz 2026-03-25T15:45:44.528 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71485.log 2026-03-25T15:45:44.529 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71468.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71468.log.gz 2026-03-25T15:45:44.529 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71502.log 2026-03-25T15:45:44.529 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71485.log.gz 2026-03-25T15:45:44.529 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71519.log 2026-03-25T15:45:44.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71502.log.gz 2026-03-25T15:45:44.530 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71536.log 2026-03-25T15:45:44.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71519.log.gz 2026-03-25T15:45:44.531 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71553.log 2026-03-25T15:45:44.531 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71536.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71536.log.gz 2026-03-25T15:45:44.531 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71570.log 2026-03-25T15:45:44.531 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71553.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.71553.log.gz 2026-03-25T15:45:44.532 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71587.log 2026-03-25T15:45:44.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71570.log.gz 2026-03-25T15:45:44.532 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71604.log 2026-03-25T15:45:44.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71587.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71587.log.gz 2026-03-25T15:45:44.533 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71621.log 2026-03-25T15:45:44.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71604.log.gz 2026-03-25T15:45:44.533 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71638.log 2026-03-25T15:45:44.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71621.log.gz 2026-03-25T15:45:44.534 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71655.log 2026-03-25T15:45:44.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71638.log.gz 2026-03-25T15:45:44.535 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71672.log 2026-03-25T15:45:44.535 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71655.log.gz 2026-03-25T15:45:44.535 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71689.log 2026-03-25T15:45:44.535 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71672.log.gz 2026-03-25T15:45:44.536 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71706.log 2026-03-25T15:45:44.536 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71689.log.gz 2026-03-25T15:45:44.536 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71723.log 2026-03-25T15:45:44.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71706.log.gz 2026-03-25T15:45:44.537 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71740.log 2026-03-25T15:45:44.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71723.log.gz 2026-03-25T15:45:44.537 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71757.log 2026-03-25T15:45:44.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71740.log.gz 2026-03-25T15:45:44.538 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71774.log 2026-03-25T15:45:44.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71757.log.gz 2026-03-25T15:45:44.539 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71791.log 2026-03-25T15:45:44.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71774.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.71774.log.gz 2026-03-25T15:45:44.539 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71808.log 2026-03-25T15:45:44.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71791.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71791.log.gz 2026-03-25T15:45:44.540 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71827.log 2026-03-25T15:45:44.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71808.log.gz 2026-03-25T15:45:44.540 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71851.log 2026-03-25T15:45:44.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71827.log: 56.1% -- replaced with /var/log/ceph/ceph-client.admin.71827.log.gz 2026-03-25T15:45:44.541 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71872.log 2026-03-25T15:45:44.541 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71851.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71851.log.gz 2026-03-25T15:45:44.541 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71893.log 2026-03-25T15:45:44.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71872.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.71872.log.gz 2026-03-25T15:45:44.542 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71914.log 2026-03-25T15:45:44.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71893.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.71893.log.gz 2026-03-25T15:45:44.542 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71935.log 2026-03-25T15:45:44.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71914.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.71914.log.gz 2026-03-25T15:45:44.543 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71956.log 2026-03-25T15:45:44.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71935.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.71935.log.gz 2026-03-25T15:45:44.544 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71977.log 2026-03-25T15:45:44.544 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71956.log: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.71956.log.gz 2026-03-25T15:45:44.544 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71998.log 2026-03-25T15:45:44.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71977.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.71977.log.gz 2026-03-25T15:45:44.545 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72019.log 2026-03-25T15:45:44.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.71998.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.71998.log.gz 2026-03-25T15:45:44.545 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72040.log 2026-03-25T15:45:44.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72019.log.gz 2026-03-25T15:45:44.546 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72061.log 2026-03-25T15:45:44.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72040.log: 26.4% -- 
replaced with /var/log/ceph/ceph-client.admin.72040.log.gz 2026-03-25T15:45:44.547 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72082.log 2026-03-25T15:45:44.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72061.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.72061.log.gz 2026-03-25T15:45:44.547 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72103.log 2026-03-25T15:45:44.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72082.log.gz 2026-03-25T15:45:44.548 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72124.log 2026-03-25T15:45:44.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72103.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72103.log.gz 2026-03-25T15:45:44.548 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72145.log 2026-03-25T15:45:44.549 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72124.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72124.log.gz 2026-03-25T15:45:44.549 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72166.log 2026-03-25T15:45:44.549 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72145.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.72145.log.gz 2026-03-25T15:45:44.549 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72187.log 2026-03-25T15:45:44.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72166.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72166.log.gz 2026-03-25T15:45:44.550 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.72208.log 2026-03-25T15:45:44.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72187.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.72187.log.gz 2026-03-25T15:45:44.551 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72229.log 2026-03-25T15:45:44.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72208.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.72208.log.gz 2026-03-25T15:45:44.551 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72250.log 2026-03-25T15:45:44.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72229.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.72229.log.gz 2026-03-25T15:45:44.552 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72271.log 2026-03-25T15:45:44.552 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72250.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72250.log.gz 2026-03-25T15:45:44.552 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72292.log 2026-03-25T15:45:44.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72271.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.72271.log.gz 2026-03-25T15:45:44.553 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72313.log 2026-03-25T15:45:44.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72292.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72292.log.gz 2026-03-25T15:45:44.553 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72334.log 2026-03-25T15:45:44.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72313.log: 26.4% -- 
replaced with /var/log/ceph/ceph-client.admin.72313.log.gz 2026-03-25T15:45:44.554 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72355.log 2026-03-25T15:45:44.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72334.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72334.log.gz 2026-03-25T15:45:44.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72376.log 2026-03-25T15:45:44.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72355.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.72355.log.gz 2026-03-25T15:45:44.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72397.log 2026-03-25T15:45:44.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72376.log: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.72376.log.gz 2026-03-25T15:45:44.556 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72418.log 2026-03-25T15:45:44.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72397.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.72397.log.gz 2026-03-25T15:45:44.557 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72439.log 2026-03-25T15:45:44.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72418.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72418.log.gz 2026-03-25T15:45:44.557 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72460.log 2026-03-25T15:45:44.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72439.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.72439.log.gz 2026-03-25T15:45:44.558 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.72481.log 2026-03-25T15:45:44.558 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72460.log.gz 2026-03-25T15:45:44.558 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72502.log 2026-03-25T15:45:44.559 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72481.log: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.72481.log.gz 2026-03-25T15:45:44.559 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72523.log 2026-03-25T15:45:44.559 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72502.log.gz 2026-03-25T15:45:44.560 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72544.log 2026-03-25T15:45:44.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72523.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72523.log.gz 2026-03-25T15:45:44.560 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72565.log 2026-03-25T15:45:44.561 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72544.log: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.72544.log.gz 2026-03-25T15:45:44.561 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72586.log 2026-03-25T15:45:44.561 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72565.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72565.log.gz 2026-03-25T15:45:44.561 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72607.log 2026-03-25T15:45:44.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72586.log: 25.7% -- 
replaced with /var/log/ceph/ceph-client.admin.72586.log.gz 2026-03-25T15:45:44.562 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72628.log 2026-03-25T15:45:44.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72607.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.72607.log.gz 2026-03-25T15:45:44.563 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72649.log 2026-03-25T15:45:44.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72628.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.72628.log.gz 2026-03-25T15:45:44.563 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72670.log 2026-03-25T15:45:44.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72649.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72649.log.gz 2026-03-25T15:45:44.564 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72691.log 2026-03-25T15:45:44.564 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72670.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72670.log.gz 2026-03-25T15:45:44.564 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72712.log 2026-03-25T15:45:44.565 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72691.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.72691.log.gz 2026-03-25T15:45:44.565 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72733.log 2026-03-25T15:45:44.565 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72712.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.72712.log.gz 2026-03-25T15:45:44.565 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.72754.log 2026-03-25T15:45:44.566 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72733.log: 27.5% -- replaced with /var/log/ceph/ceph-client.admin.72733.log.gz 2026-03-25T15:45:44.566 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72775.log 2026-03-25T15:45:44.566 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72754.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.72754.log.gz 2026-03-25T15:45:44.566 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72796.log 2026-03-25T15:45:44.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72775.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72775.log.gz 2026-03-25T15:45:44.567 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72817.log 2026-03-25T15:45:44.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72796.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.72796.log.gz 2026-03-25T15:45:44.568 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72838.log 2026-03-25T15:45:44.568 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72817.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.72817.log.gz 2026-03-25T15:45:44.568 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72859.log 2026-03-25T15:45:44.568 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72838.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.72838.log.gz 2026-03-25T15:45:44.569 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72880.log 2026-03-25T15:45:44.569 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72859.log: 26.0% -- 
replaced with /var/log/ceph/ceph-client.admin.72859.log.gz
2026-03-25T15:45:44.569 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72901.log
2026-03-25T15:45:44.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72880.log.gz
2026-03-25T15:45:44.570 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72922.log
2026-03-25T15:45:44.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72901.log: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.72901.log.gz
2026-03-25T15:45:44.570 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72943.log
2026-03-25T15:45:44.571 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72922.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.72922.log.gz
2026-03-25T15:45:44.571 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72964.log
2026-03-25T15:45:44.571 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72943.log.gz
2026-03-25T15:45:44.572 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72985.log
2026-03-25T15:45:44.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72964.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.72964.log.gz
2026-03-25T15:45:44.572 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73006.log
2026-03-25T15:45:44.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.72985.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.72985.log.gz
2026-03-25T15:45:44.573 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73027.log
2026-03-25T15:45:44.573 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73006.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73006.log.gz
2026-03-25T15:45:44.573 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73048.log
2026-03-25T15:45:44.573 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73027.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73027.log.gz
2026-03-25T15:45:44.574 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73069.log
2026-03-25T15:45:44.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73048.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73048.log.gz
2026-03-25T15:45:44.574 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73090.log
2026-03-25T15:45:44.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73069.log.gz
2026-03-25T15:45:44.575 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73111.log
2026-03-25T15:45:44.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73090.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73090.log.gz
2026-03-25T15:45:44.576 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73132.log
2026-03-25T15:45:44.576 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73111.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.73111.log.gz
2026-03-25T15:45:44.576 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73153.log
2026-03-25T15:45:44.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73132.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73132.log.gz
2026-03-25T15:45:44.577 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73174.log
2026-03-25T15:45:44.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73153.log.gz
2026-03-25T15:45:44.577 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73195.log
2026-03-25T15:45:44.578 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73174.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73174.log.gz
2026-03-25T15:45:44.578 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73216.log
2026-03-25T15:45:44.578 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73195.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.73195.log.gz
2026-03-25T15:45:44.578 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73237.log
2026-03-25T15:45:44.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73216.log.gz
2026-03-25T15:45:44.579 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73258.log
2026-03-25T15:45:44.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73237.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.73237.log.gz
2026-03-25T15:45:44.579 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73279.log
2026-03-25T15:45:44.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73258.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73258.log.gz
2026-03-25T15:45:44.580 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73300.log
2026-03-25T15:45:44.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73279.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73279.log.gz
2026-03-25T15:45:44.581 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73321.log
2026-03-25T15:45:44.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73300.log.gz
2026-03-25T15:45:44.581 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73342.log
2026-03-25T15:45:44.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73321.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73321.log.gz
2026-03-25T15:45:44.582 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73363.log
2026-03-25T15:45:44.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73342.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.73342.log.gz
2026-03-25T15:45:44.582 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73384.log
2026-03-25T15:45:44.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73363.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73363.log.gz
2026-03-25T15:45:44.583 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73405.log
2026-03-25T15:45:44.583 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73384.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.73384.log.gz
2026-03-25T15:45:44.583 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73426.log
2026-03-25T15:45:44.583 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73405.log.gz
2026-03-25T15:45:44.584 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73447.log
2026-03-25T15:45:44.584 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73426.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.73426.log.gz
2026-03-25T15:45:44.584 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73468.log
2026-03-25T15:45:44.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73447.log.gz
2026-03-25T15:45:44.585 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73489.log
2026-03-25T15:45:44.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73468.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.73468.log.gz
2026-03-25T15:45:44.585 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73510.log
2026-03-25T15:45:44.586 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73489.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73489.log.gz
2026-03-25T15:45:44.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73531.log
2026-03-25T15:45:44.586 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73510.log.gz
2026-03-25T15:45:44.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73552.log
2026-03-25T15:45:44.587 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73531.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73531.log.gz
2026-03-25T15:45:44.587 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73573.log
2026-03-25T15:45:44.587 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73552.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.73552.log.gz
2026-03-25T15:45:44.588 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73594.log
2026-03-25T15:45:44.588 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73573.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73573.log.gz
2026-03-25T15:45:44.588 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73615.log
2026-03-25T15:45:44.588 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73594.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73594.log.gz
2026-03-25T15:45:44.589 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73636.log
2026-03-25T15:45:44.589 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73615.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.73615.log.gz
2026-03-25T15:45:44.589 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73657.log
2026-03-25T15:45:44.589 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73636.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.73636.log.gz
2026-03-25T15:45:44.590 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73678.log
2026-03-25T15:45:44.590 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73657.log.gz
2026-03-25T15:45:44.590 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73699.log
2026-03-25T15:45:44.591 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73678.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73678.log.gz
2026-03-25T15:45:44.591 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73720.log
2026-03-25T15:45:44.591 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73699.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73699.log.gz
2026-03-25T15:45:44.591 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73741.log
2026-03-25T15:45:44.592 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73720.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.73720.log.gz
2026-03-25T15:45:44.592 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73762.log
2026-03-25T15:45:44.592 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73741.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.73741.log.gz
2026-03-25T15:45:44.593 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73783.log
2026-03-25T15:45:44.593 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73762.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.73762.log.gz
2026-03-25T15:45:44.593 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73804.log
2026-03-25T15:45:44.593 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73783.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73783.log.gz
2026-03-25T15:45:44.594 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73825.log
2026-03-25T15:45:44.594 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73804.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73804.log.gz
2026-03-25T15:45:44.595 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73846.log
2026-03-25T15:45:44.595 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73825.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.73825.log.gz
2026-03-25T15:45:44.595 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73867.log
2026-03-25T15:45:44.596 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73846.log.gz
2026-03-25T15:45:44.596 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73888.log
2026-03-25T15:45:44.596 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73867.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73867.log.gz
2026-03-25T15:45:44.596 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73909.log
2026-03-25T15:45:44.597 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73888.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.73888.log.gz
2026-03-25T15:45:44.597 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73930.log
2026-03-25T15:45:44.597 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73909.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.73909.log.gz
2026-03-25T15:45:44.598 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73953.log
2026-03-25T15:45:44.598 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73930.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73930.log.gz
2026-03-25T15:45:44.598 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73974.log
2026-03-25T15:45:44.598 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73953.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73953.log.gz
2026-03-25T15:45:44.599 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73996.log
2026-03-25T15:45:44.599 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73974.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73974.log.gz
2026-03-25T15:45:44.599 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74017.log
2026-03-25T15:45:44.600 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.73996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73996.log.gz
2026-03-25T15:45:44.600 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74039.log
2026-03-25T15:45:44.600 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74017.log.gz
2026-03-25T15:45:44.600 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74060.log
2026-03-25T15:45:44.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74039.log.gz
2026-03-25T15:45:44.601 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74082.log
2026-03-25T15:45:44.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74060.log.gz
2026-03-25T15:45:44.601 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74103.log
2026-03-25T15:45:44.602 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74082.log.gz
2026-03-25T15:45:44.602 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74125.log
2026-03-25T15:45:44.602 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74103.log.gz
2026-03-25T15:45:44.602 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74146.log
2026-03-25T15:45:44.603 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74125.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74125.log.gz
2026-03-25T15:45:44.603 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74168.log
2026-03-25T15:45:44.603 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74146.log.gz
2026-03-25T15:45:44.604 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74189.log
2026-03-25T15:45:44.604 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74168.log.gz
2026-03-25T15:45:44.604 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74211.log
2026-03-25T15:45:44.604 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74189.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74189.log.gz
2026-03-25T15:45:44.605 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74232.log
2026-03-25T15:45:44.605 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74211.log.gz
2026-03-25T15:45:44.605 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74254.log
2026-03-25T15:45:44.605 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74232.log.gz
2026-03-25T15:45:44.606 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74275.log
2026-03-25T15:45:44.606 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74254.log.gz
2026-03-25T15:45:44.606 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74297.log
2026-03-25T15:45:44.607 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74275.log.gz
2026-03-25T15:45:44.607 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74318.log
2026-03-25T15:45:44.607 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74297.log.gz
2026-03-25T15:45:44.607 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74340.log
2026-03-25T15:45:44.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74318.log.gz
2026-03-25T15:45:44.608 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74361.log
2026-03-25T15:45:44.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74340.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74340.log.gz
2026-03-25T15:45:44.608 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74383.log
2026-03-25T15:45:44.609 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74361.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74361.log.gz
2026-03-25T15:45:44.609 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74404.log
2026-03-25T15:45:44.609 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74383.log.gz
2026-03-25T15:45:44.609 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74426.log
2026-03-25T15:45:44.610 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74404.log.gz
2026-03-25T15:45:44.610 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74447.log
2026-03-25T15:45:44.610 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74426.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74426.log.gz
2026-03-25T15:45:44.611 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74469.log
2026-03-25T15:45:44.611 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74447.log.gz
2026-03-25T15:45:44.611 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74490.log
2026-03-25T15:45:44.611 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74469.log.gz
2026-03-25T15:45:44.612 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74512.log
2026-03-25T15:45:44.612 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74490.log.gz
2026-03-25T15:45:44.612 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74533.log
2026-03-25T15:45:44.613 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74512.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74512.log.gz
2026-03-25T15:45:44.613 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74555.log
2026-03-25T15:45:44.613 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74533.log.gz
2026-03-25T15:45:44.614 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74576.log
2026-03-25T15:45:44.614 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74555.log.gz
2026-03-25T15:45:44.614 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74598.log
2026-03-25T15:45:44.614 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74576.log.gz
2026-03-25T15:45:44.615 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74619.log
2026-03-25T15:45:44.615 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74598.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74598.log.gz
2026-03-25T15:45:44.615 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74641.log
2026-03-25T15:45:44.615 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74619.log.gz
2026-03-25T15:45:44.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74662.log
2026-03-25T15:45:44.616 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74641.log.gz
2026-03-25T15:45:44.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74684.log
2026-03-25T15:45:44.617 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74662.log.gz
2026-03-25T15:45:44.617 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74705.log
2026-03-25T15:45:44.617 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74684.log.gz
2026-03-25T15:45:44.617 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74726.log
2026-03-25T15:45:44.618 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74705.log.gz
2026-03-25T15:45:44.618 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74747.log
2026-03-25T15:45:44.618 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74726.log.gz
2026-03-25T15:45:44.619 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74764.log
2026-03-25T15:45:44.619 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74747.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.74747.log.gz
2026-03-25T15:45:44.619 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74784.log
2026-03-25T15:45:44.619 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74764.log.gz
2026-03-25T15:45:44.620 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74803.log
2026-03-25T15:45:44.620 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74784.log.gz
2026-03-25T15:45:44.620 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74820.log
2026-03-25T15:45:44.620 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74803.log.gz
2026-03-25T15:45:44.621 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74841.log
2026-03-25T15:45:44.621 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74820.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.74820.log.gz
2026-03-25T15:45:44.621 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74860.log
2026-03-25T15:45:44.622 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74841.log.gz
2026-03-25T15:45:44.622 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74877.log
2026-03-25T15:45:44.623 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74860.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.74860.log.gz
2026-03-25T15:45:44.623 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74898.log
2026-03-25T15:45:44.623 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74877.log.gz
2026-03-25T15:45:44.623 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74918.log
2026-03-25T15:45:44.624 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74898.log.gz
2026-03-25T15:45:44.624 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74937.log
2026-03-25T15:45:44.624 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74918.log.gz
2026-03-25T15:45:44.625 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74955.log
2026-03-25T15:45:44.625 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74937.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74937.log.gz
2026-03-25T15:45:44.625 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74977.log
2026-03-25T15:45:44.626 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74955.log.gz
2026-03-25T15:45:44.626 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74998.log
2026-03-25T15:45:44.626 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74977.log.gz
2026-03-25T15:45:44.626 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75019.log
2026-03-25T15:45:44.627 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.74998.log: 65.7% -- replaced with /var/log/ceph/ceph-client.admin.74998.log.gz
2026-03-25T15:45:44.627 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75038.log
2026-03-25T15:45:44.627 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75019.log.gz
2026-03-25T15:45:44.628 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75055.log
2026-03-25T15:45:44.628 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75038.log.gz
2026-03-25T15:45:44.628 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75076.log
2026-03-25T15:45:44.628 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75055.log.gz
2026-03-25T15:45:44.629 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75097.log
2026-03-25T15:45:44.629 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75076.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.75076.log.gz
2026-03-25T15:45:44.629 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75116.log
2026-03-25T15:45:44.629 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75097.log.gz
2026-03-25T15:45:44.630 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75134.log
2026-03-25T15:45:44.630 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75116.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75116.log.gz
2026-03-25T15:45:44.630 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75156.log
2026-03-25T15:45:44.631 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75134.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75134.log.gz
2026-03-25T15:45:44.631 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75177.log
2026-03-25T15:45:44.631 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75156.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75156.log.gz
2026-03-25T15:45:44.631 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75198.log
2026-03-25T15:45:44.632 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75177.log.gz
2026-03-25T15:45:44.632 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75219.log
2026-03-25T15:45:44.632 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75198.log: 67.1% -- replaced with /var/log/ceph/ceph-client.admin.75198.log.gz
2026-03-25T15:45:44.633 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75238.log
2026-03-25T15:45:44.633 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75219.log.gz
2026-03-25T15:45:44.633 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75255.log
2026-03-25T15:45:44.634 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75238.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75238.log.gz
2026-03-25T15:45:44.634 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75275.log
2026-03-25T15:45:44.634 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75255.log.gz
2026-03-25T15:45:44.635 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75295.log
2026-03-25T15:45:44.635 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75275.log.gz
2026-03-25T15:45:44.635 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75315.log
2026-03-25T15:45:44.636 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75295.log.gz
2026-03-25T15:45:44.636 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75336.log
2026-03-25T15:45:44.636 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75315.log.gz
2026-03-25T15:45:44.636 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75357.log
2026-03-25T15:45:44.637 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75336.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.75336.log.gz
2026-03-25T15:45:44.637 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75377.log
2026-03-25T15:45:44.637 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75357.log.gz
2026-03-25T15:45:44.637 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75397.log
2026-03-25T15:45:44.638 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75377.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75377.log.gz
2026-03-25T15:45:44.638 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75417.log
2026-03-25T15:45:44.638 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75397.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75397.log.gz
2026-03-25T15:45:44.639 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75439.log
2026-03-25T15:45:44.639 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75417.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.75417.log.gz
2026-03-25T15:45:44.639 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75460.log
2026-03-25T15:45:44.639 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75439.log.gz
2026-03-25T15:45:44.640 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75482.log
2026-03-25T15:45:44.640 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75460.log.gz
2026-03-25T15:45:44.640 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75503.log
2026-03-25T15:45:44.640 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75482.log.gz
2026-03-25T15:45:44.641 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75525.log
2026-03-25T15:45:44.641 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75503.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75503.log.gz
2026-03-25T15:45:44.641 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75546.log
2026-03-25T15:45:44.641 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75525.log.gz
2026-03-25T15:45:44.642 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75568.log
2026-03-25T15:45:44.642 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75546.log.gz
2026-03-25T15:45:44.642 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75589.log
2026-03-25T15:45:44.642 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75568.log.gz
2026-03-25T15:45:44.643 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75611.log
2026-03-25T15:45:44.643 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75589.log.gz
2026-03-25T15:45:44.643 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75632.log
2026-03-25T15:45:44.644 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75611.log.gz
2026-03-25T15:45:44.644 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75654.log
2026-03-25T15:45:44.644 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75632.log.gz
2026-03-25T15:45:44.644 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75675.log
2026-03-25T15:45:44.645 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75654.log.gz
2026-03-25T15:45:44.645 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75697.log
2026-03-25T15:45:44.645 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75675.log.gz
2026-03-25T15:45:44.645 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75718.log
2026-03-25T15:45:44.646 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75697.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75697.log.gz
2026-03-25T15:45:44.646 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.75740.log 2026-03-25T15:45:44.646 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75718.log.gz 2026-03-25T15:45:44.646 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75761.log 2026-03-25T15:45:44.647 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75740.log.gz 2026-03-25T15:45:44.647 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75783.log 2026-03-25T15:45:44.647 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75761.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75761.log.gz 2026-03-25T15:45:44.647 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75804.log 2026-03-25T15:45:44.648 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75783.log.gz 2026-03-25T15:45:44.648 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75826.log 2026-03-25T15:45:44.648 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75804.log.gz 2026-03-25T15:45:44.649 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75847.log 2026-03-25T15:45:44.649 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75826.log.gz 2026-03-25T15:45:44.649 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75869.log 2026-03-25T15:45:44.649 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75847.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.75847.log.gz 2026-03-25T15:45:44.650 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75890.log 2026-03-25T15:45:44.650 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75869.log.gz 2026-03-25T15:45:44.650 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75912.log 2026-03-25T15:45:44.650 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75890.log.gz 2026-03-25T15:45:44.651 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75933.log 2026-03-25T15:45:44.651 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75912.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75912.log.gz 2026-03-25T15:45:44.651 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75955.log 2026-03-25T15:45:44.652 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75933.log.gz 2026-03-25T15:45:44.652 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75976.log 2026-03-25T15:45:44.652 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75955.log.gz 2026-03-25T15:45:44.652 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75998.log 2026-03-25T15:45:44.653 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75976.log.gz 2026-03-25T15:45:44.653 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.76019.log 2026-03-25T15:45:44.653 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.75998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75998.log.gz 2026-03-25T15:45:44.653 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76041.log 2026-03-25T15:45:44.654 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76019.log.gz 2026-03-25T15:45:44.654 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76062.log 2026-03-25T15:45:44.654 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76041.log.gz 2026-03-25T15:45:44.655 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76084.log 2026-03-25T15:45:44.655 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76062.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76062.log.gz 2026-03-25T15:45:44.655 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76105.log 2026-03-25T15:45:44.655 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76084.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76084.log.gz 2026-03-25T15:45:44.656 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76127.log 2026-03-25T15:45:44.656 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76105.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76105.log.gz 2026-03-25T15:45:44.656 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76148.log 2026-03-25T15:45:44.656 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76127.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.76127.log.gz 2026-03-25T15:45:44.657 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76170.log 2026-03-25T15:45:44.657 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76148.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76148.log.gz 2026-03-25T15:45:44.657 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76191.log 2026-03-25T15:45:44.658 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76170.log.gz 2026-03-25T15:45:44.658 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76212.log 2026-03-25T15:45:44.658 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76191.log.gz 2026-03-25T15:45:44.658 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76238.log 2026-03-25T15:45:44.659 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76212.log.gz 2026-03-25T15:45:44.659 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76255.log 2026-03-25T15:45:44.659 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76238.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76238.log.gz 2026-03-25T15:45:44.659 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76272.log 2026-03-25T15:45:44.660 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76255.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.76255.log.gz 2026-03-25T15:45:44.660 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.76293.log 2026-03-25T15:45:44.660 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76272.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76272.log.gz 2026-03-25T15:45:44.660 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76314.log 2026-03-25T15:45:44.661 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76293.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76293.log.gz 2026-03-25T15:45:44.661 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76335.log 2026-03-25T15:45:44.661 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76314.log.gz 2026-03-25T15:45:44.661 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76356.log 2026-03-25T15:45:44.662 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76335.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76335.log.gz 2026-03-25T15:45:44.662 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76376.log 2026-03-25T15:45:44.662 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76356.log: 17.8% -- replaced with /var/log/ceph/ceph-client.admin.76356.log.gz 2026-03-25T15:45:44.662 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76397.log 2026-03-25T15:45:44.663 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76376.log: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.76376.log.gz 2026-03-25T15:45:44.663 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76418.log 2026-03-25T15:45:44.663 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76397.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.76397.log.gz 2026-03-25T15:45:44.663 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76438.log 2026-03-25T15:45:44.664 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76418.log: 59.4% -- replaced with /var/log/ceph/ceph-client.admin.76418.log.gz 2026-03-25T15:45:44.664 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76459.log 2026-03-25T15:45:44.664 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76438.log.gz 2026-03-25T15:45:44.664 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76480.log 2026-03-25T15:45:44.665 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76459.log.gz 2026-03-25T15:45:44.665 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76500.log 2026-03-25T15:45:44.665 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76480.log.gz 2026-03-25T15:45:44.665 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76521.log 2026-03-25T15:45:44.666 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76500.log.gz 2026-03-25T15:45:44.666 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76541.log 2026-03-25T15:45:44.666 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76521.log: 53.8% -- replaced with /var/log/ceph/ceph-client.admin.76521.log.gz 2026-03-25T15:45:44.666 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.76562.log 2026-03-25T15:45:44.667 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76541.log: 55.3% -- replaced with /var/log/ceph/ceph-client.admin.76541.log.gz 2026-03-25T15:45:44.667 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76583.log 2026-03-25T15:45:44.667 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76562.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.76562.log.gz 2026-03-25T15:45:44.668 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76604.log 2026-03-25T15:45:44.668 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76583.log.gz 2026-03-25T15:45:44.668 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76623.log 2026-03-25T15:45:44.668 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76604.log.gz 2026-03-25T15:45:44.669 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76641.log 2026-03-25T15:45:44.669 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76623.log.gz 2026-03-25T15:45:44.669 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76662.log 2026-03-25T15:45:44.670 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76641.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.76641.log.gz 2026-03-25T15:45:44.670 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76683.log 2026-03-25T15:45:44.670 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76662.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.76662.log.gz 2026-03-25T15:45:44.670 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76703.log 2026-03-25T15:45:44.671 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76683.log: 17.2% -- replaced with /var/log/ceph/ceph-client.admin.76683.log.gz 2026-03-25T15:45:44.671 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76724.log 2026-03-25T15:45:44.671 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76703.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.76703.log.gz 2026-03-25T15:45:44.671 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76741.log 2026-03-25T15:45:44.672 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76724.log.gz 2026-03-25T15:45:44.672 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76758.log 2026-03-25T15:45:44.672 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76741.log.gz 2026-03-25T15:45:44.672 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76779.log 2026-03-25T15:45:44.673 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76758.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.76758.log.gz 2026-03-25T15:45:44.673 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76800.log 2026-03-25T15:45:44.673 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76779.log.gz 2026-03-25T15:45:44.673 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.76821.log 2026-03-25T15:45:44.674 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76800.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.76800.log.gz 2026-03-25T15:45:44.674 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76842.log 2026-03-25T15:45:44.674 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76821.log.gz 2026-03-25T15:45:44.674 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76863.log 2026-03-25T15:45:44.675 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76842.log: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.76842.log.gz 2026-03-25T15:45:44.675 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76883.log 2026-03-25T15:45:44.675 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76863.log: 26.9% -- replaced with /var/log/ceph/ceph-client.admin.76863.log.gz 2026-03-25T15:45:44.675 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76903.log 2026-03-25T15:45:44.676 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76883.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.76883.log.gz 2026-03-25T15:45:44.676 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76924.log 2026-03-25T15:45:44.676 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76903.log: 52.7% -- replaced with /var/log/ceph/ceph-client.admin.76903.log.gz 2026-03-25T15:45:44.676 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76941.log 2026-03-25T15:45:44.677 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76924.log: 1.2% -- 
replaced with /var/log/ceph/ceph-client.admin.76924.log.gz 2026-03-25T15:45:44.677 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76961.log 2026-03-25T15:45:44.677 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76941.log.gz 2026-03-25T15:45:44.677 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76982.log 2026-03-25T15:45:44.677 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76961.log.gz 2026-03-25T15:45:44.678 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77002.log 2026-03-25T15:45:44.678 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.76982.log: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.76982.log.gz 2026-03-25T15:45:44.678 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77022.log 2026-03-25T15:45:44.678 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77002.log: 54.9% -- replaced with /var/log/ceph/ceph-client.admin.77002.log.gz 2026-03-25T15:45:44.679 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77042.log 2026-03-25T15:45:44.679 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77022.log.gz 2026-03-25T15:45:44.679 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77064.log 2026-03-25T15:45:44.679 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77042.log: 10.2% -- replaced with /var/log/ceph/ceph-client.admin.77042.log.gz 2026-03-25T15:45:44.680 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.77082.log 2026-03-25T15:45:44.680 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77064.log.gz 2026-03-25T15:45:44.680 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77102.log 2026-03-25T15:45:44.680 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77082.log: 53.8% -- replaced with /var/log/ceph/ceph-client.admin.77082.log.gz 2026-03-25T15:45:44.681 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77119.log 2026-03-25T15:45:44.681 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77102.log: 15.1% -- replaced with /var/log/ceph/ceph-client.admin.77102.log.gz 2026-03-25T15:45:44.681 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77140.log 2026-03-25T15:45:44.681 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77119.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.77119.log.gz 2026-03-25T15:45:44.682 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77162.log 2026-03-25T15:45:44.682 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77140.log.gz 2026-03-25T15:45:44.682 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77183.log 2026-03-25T15:45:44.682 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77162.log.gz 2026-03-25T15:45:44.682 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77204.log 2026-03-25T15:45:44.683 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77183.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.77183.log.gz 2026-03-25T15:45:44.683 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77224.log 2026-03-25T15:45:44.683 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77204.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77204.log.gz 2026-03-25T15:45:44.683 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77244.log 2026-03-25T15:45:44.684 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77224.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77224.log.gz 2026-03-25T15:45:44.684 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77264.log 2026-03-25T15:45:44.684 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77244.log.gz 2026-03-25T15:45:44.684 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77284.log 2026-03-25T15:45:44.685 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77264.log.gz 2026-03-25T15:45:44.685 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77304.log 2026-03-25T15:45:44.685 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77284.log.gz 2026-03-25T15:45:44.685 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77325.log 2026-03-25T15:45:44.685 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77304.log.gz 2026-03-25T15:45:44.686 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.77346.log 2026-03-25T15:45:44.686 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77325.log.gz 2026-03-25T15:45:44.686 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77368.log 2026-03-25T15:45:44.687 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77346.log.gz 2026-03-25T15:45:44.687 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77390.log 2026-03-25T15:45:44.687 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77368.log.gz 2026-03-25T15:45:44.687 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77411.log 2026-03-25T15:45:44.688 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77390.log.gz 2026-03-25T15:45:44.688 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77431.log 2026-03-25T15:45:44.688 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77411.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77411.log.gz 2026-03-25T15:45:44.688 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77451.log 2026-03-25T15:45:44.689 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77431.log.gz 2026-03-25T15:45:44.689 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77471.log 2026-03-25T15:45:44.689 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77451.log: 58.8% -- replaced 
with /var/log/ceph/ceph-client.admin.77451.log.gz 2026-03-25T15:45:44.689 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77492.log 2026-03-25T15:45:44.690 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77471.log.gz 2026-03-25T15:45:44.690 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77513.log 2026-03-25T15:45:44.690 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77492.log.gz 2026-03-25T15:45:44.690 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77534.log 2026-03-25T15:45:44.691 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77513.log.gz 2026-03-25T15:45:44.691 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77554.log 2026-03-25T15:45:44.691 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77534.log.gz 2026-03-25T15:45:44.691 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77574.log 2026-03-25T15:45:44.692 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77554.log.gz 2026-03-25T15:45:44.692 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77594.log 2026-03-25T15:45:44.692 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77574.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77574.log.gz 2026-03-25T15:45:44.692 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.77614.log
2026-03-25T15:45:44.693 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77594.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77594.log.gz
2026-03-25T15:45:44.693 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77636.log
2026-03-25T15:45:44.693 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77614.log: 54.9% -- replaced with /var/log/ceph/ceph-client.admin.77614.log.gz
2026-03-25T15:45:44.693 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77658.log
2026-03-25T15:45:44.694 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77636.log.gz
2026-03-25T15:45:44.694 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77679.log
2026-03-25T15:45:44.694 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77658.log.gz
2026-03-25T15:45:44.694 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77699.log
2026-03-25T15:45:44.695 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77679.log: 57.7% -- replaced with /var/log/ceph/ceph-client.admin.77679.log.gz
2026-03-25T15:45:44.695 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77720.log
2026-03-25T15:45:44.695 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77699.log.gz
2026-03-25T15:45:44.695 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77741.log
2026-03-25T15:45:44.696 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77720.log.gz
2026-03-25T15:45:44.696 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77762.log
2026-03-25T15:45:44.696 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77741.log.gz
2026-03-25T15:45:44.697 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77782.log
2026-03-25T15:45:44.697 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77802.log
2026-03-25T15:45:44.697 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77782.log.gz
2026-03-25T15:45:44.697 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77822.log
2026-03-25T15:45:44.698 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77802.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.77802.log.gz
2026-03-25T15:45:44.698 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77762.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77762.log.gz
2026-03-25T15:45:44.698 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77842.log
2026-03-25T15:45:44.698 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77822.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.77822.log.gz
2026-03-25T15:45:44.699 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77864.log
2026-03-25T15:45:44.699 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77842.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.77842.log.gz
2026-03-25T15:45:44.699 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.77886.log
2026-03-25T15:45:44.700 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77864.log.gz
2026-03-25T15:45:44.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77907.log
2026-03-25T15:45:44.700 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77886.log.gz
2026-03-25T15:45:44.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77928.log
2026-03-25T15:45:44.701 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77907.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.77907.log.gz
2026-03-25T15:45:44.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77949.log
2026-03-25T15:45:44.702 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77928.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.77928.log.gz
2026-03-25T15:45:44.702 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77969.log
2026-03-25T15:45:44.702 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77949.log.gz
2026-03-25T15:45:44.702 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77989.log
2026-03-25T15:45:44.703 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77969.log.gz
2026-03-25T15:45:44.703 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78010.log
2026-03-25T15:45:44.703 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.77989.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.77989.log.gz
2026-03-25T15:45:44.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78027.log
2026-03-25T15:45:44.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78010.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.78010.log.gz
2026-03-25T15:45:44.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78047.log
2026-03-25T15:45:44.705 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78027.log.gz
2026-03-25T15:45:44.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78068.log
2026-03-25T15:45:44.705 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78047.log.gz
2026-03-25T15:45:44.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78089.log
2026-03-25T15:45:44.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78068.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.78068.log.gz
2026-03-25T15:45:44.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78110.log
2026-03-25T15:45:44.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78089.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78089.log.gz
2026-03-25T15:45:44.707 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78130.log
2026-03-25T15:45:44.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78110.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78110.log.gz
2026-03-25T15:45:44.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.78147.log
2026-03-25T15:45:44.708 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78130.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.78130.log.gz
2026-03-25T15:45:44.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78167.log
2026-03-25T15:45:44.709 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78147.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78147.log.gz
2026-03-25T15:45:44.709 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78188.log
2026-03-25T15:45:44.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78167.log.gz
2026-03-25T15:45:44.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78208.log
2026-03-25T15:45:44.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78188.log: 54.9% -- replaced with /var/log/ceph/ceph-client.admin.78188.log.gz
2026-03-25T15:45:44.711 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78229.log
2026-03-25T15:45:44.711 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78208.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.78208.log.gz
2026-03-25T15:45:44.711 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78250.log
2026-03-25T15:45:44.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78229.log.gz
2026-03-25T15:45:44.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78270.log
2026-03-25T15:45:44.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78250.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78250.log.gz
2026-03-25T15:45:44.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78287.log
2026-03-25T15:45:44.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78270.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.78270.log.gz
2026-03-25T15:45:44.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78307.log
2026-03-25T15:45:44.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78287.log: 32.2% -- replaced with /var/log/ceph/ceph-client.admin.78287.log.gz
2026-03-25T15:45:44.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78328.log
2026-03-25T15:45:44.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78307.log.gz
2026-03-25T15:45:44.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78348.log
2026-03-25T15:45:44.715 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78328.log.gz
2026-03-25T15:45:44.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78365.log
2026-03-25T15:45:44.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78348.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.78348.log.gz
2026-03-25T15:45:44.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78385.log
2026-03-25T15:45:44.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78365.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78365.log.gz
2026-03-25T15:45:44.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.78406.log
2026-03-25T15:45:44.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78385.log.gz
2026-03-25T15:45:44.717 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78428.log
2026-03-25T15:45:44.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78406.log.gz
2026-03-25T15:45:44.718 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78449.log
2026-03-25T15:45:44.718 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78428.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78428.log.gz
2026-03-25T15:45:44.718 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78469.log
2026-03-25T15:45:44.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78449.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78449.log.gz
2026-03-25T15:45:44.719 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78486.log
2026-03-25T15:45:44.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78469.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.78469.log.gz
2026-03-25T15:45:44.720 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78506.log
2026-03-25T15:45:44.720 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78486.log.gz
2026-03-25T15:45:44.720 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78527.log
2026-03-25T15:45:44.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78506.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.78506.log.gz
2026-03-25T15:45:44.721 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78548.log
2026-03-25T15:45:44.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78527.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78527.log.gz
2026-03-25T15:45:44.721 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78568.log
2026-03-25T15:45:44.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78548.log.gz
2026-03-25T15:45:44.722 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78585.log
2026-03-25T15:45:44.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78568.log.gz
2026-03-25T15:45:44.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78605.log
2026-03-25T15:45:44.723 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78585.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.78585.log.gz
2026-03-25T15:45:44.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78626.log
2026-03-25T15:45:44.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78605.log.gz
2026-03-25T15:45:44.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78647.log
2026-03-25T15:45:44.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78626.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.78626.log.gz
2026-03-25T15:45:44.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.78668.log
2026-03-25T15:45:44.725 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78647.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78647.log.gz
2026-03-25T15:45:44.725 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78689.log
2026-03-25T15:45:44.725 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78668.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.78668.log.gz
2026-03-25T15:45:44.726 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78706.log
2026-03-25T15:45:44.726 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78689.log.gz
2026-03-25T15:45:44.726 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78726.log
2026-03-25T15:45:44.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78706.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.78706.log.gz
2026-03-25T15:45:44.727 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78747.log
2026-03-25T15:45:44.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78726.log.gz
2026-03-25T15:45:44.727 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78767.log
2026-03-25T15:45:44.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78747.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.78747.log.gz
2026-03-25T15:45:44.728 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78788.log
2026-03-25T15:45:44.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78767.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.78767.log.gz
2026-03-25T15:45:44.729 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78809.log
2026-03-25T15:45:44.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78788.log.gz
2026-03-25T15:45:44.729 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78830.log
2026-03-25T15:45:44.730 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78809.log: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.78809.log.gz
2026-03-25T15:45:44.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78847.log
2026-03-25T15:45:44.730 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78830.log.gz
2026-03-25T15:45:44.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78867.log
2026-03-25T15:45:44.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78847.log: 37.4% -- replaced with /var/log/ceph/ceph-client.admin.78847.log.gz
2026-03-25T15:45:44.731 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78888.log
2026-03-25T15:45:44.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78867.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78867.log.gz
2026-03-25T15:45:44.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78909.log
2026-03-25T15:45:44.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78888.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.78888.log.gz
2026-03-25T15:45:44.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.78926.log
2026-03-25T15:45:44.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78909.log.gz
2026-03-25T15:45:44.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78946.log
2026-03-25T15:45:44.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78926.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.78926.log.gz
2026-03-25T15:45:44.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78967.log
2026-03-25T15:45:44.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78946.log.gz
2026-03-25T15:45:44.734 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78988.log
2026-03-25T15:45:44.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78967.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78967.log.gz
2026-03-25T15:45:44.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79009.log
2026-03-25T15:45:44.735 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.78988.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78988.log.gz
2026-03-25T15:45:44.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79030.log
2026-03-25T15:45:44.736 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79009.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.79009.log.gz
2026-03-25T15:45:44.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79047.log
2026-03-25T15:45:44.736 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79030.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79030.log.gz
2026-03-25T15:45:44.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79067.log
2026-03-25T15:45:44.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79047.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.79047.log.gz
2026-03-25T15:45:44.737 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79088.log
2026-03-25T15:45:44.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79067.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.79067.log.gz
2026-03-25T15:45:44.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79109.log
2026-03-25T15:45:44.738 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79088.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79088.log.gz
2026-03-25T15:45:44.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79130.log
2026-03-25T15:45:44.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79109.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.79109.log.gz
2026-03-25T15:45:44.739 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79147.log
2026-03-25T15:45:44.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79130.log.gz
2026-03-25T15:45:44.740 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79167.log
2026-03-25T15:45:44.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79147.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.79147.log.gz
2026-03-25T15:45:44.740 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.79188.log
2026-03-25T15:45:44.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79167.log.gz
2026-03-25T15:45:44.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79209.log
2026-03-25T15:45:44.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79188.log.gz
2026-03-25T15:45:44.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79230.log
2026-03-25T15:45:44.742 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79209.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79209.log.gz
2026-03-25T15:45:44.742 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79252.log
2026-03-25T15:45:44.742 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79230.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.79230.log.gz
2026-03-25T15:45:44.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79273.log
2026-03-25T15:45:44.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79252.log.gz
2026-03-25T15:45:44.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79295.log
2026-03-25T15:45:44.744 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79273.log.gz
2026-03-25T15:45:44.744 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79316.log
2026-03-25T15:45:44.744 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79295.log.gz
2026-03-25T15:45:44.744 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79338.log
2026-03-25T15:45:44.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79316.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79316.log.gz
2026-03-25T15:45:44.745 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79359.log
2026-03-25T15:45:44.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79338.log.gz
2026-03-25T15:45:44.746 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79381.log
2026-03-25T15:45:44.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79359.log.gz
2026-03-25T15:45:44.747 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79402.log
2026-03-25T15:45:44.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79381.log.gz
2026-03-25T15:45:44.747 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79424.log
2026-03-25T15:45:44.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79402.log.gz
2026-03-25T15:45:44.748 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79445.log
2026-03-25T15:45:44.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79424.log.gz
2026-03-25T15:45:44.749 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.79467.log
2026-03-25T15:45:44.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79445.log.gz
2026-03-25T15:45:44.749 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79488.log
2026-03-25T15:45:44.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79467.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79467.log.gz
2026-03-25T15:45:44.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79510.log
2026-03-25T15:45:44.750 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79488.log.gz
2026-03-25T15:45:44.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79531.log
2026-03-25T15:45:44.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79510.log.gz
2026-03-25T15:45:44.751 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79553.log
2026-03-25T15:45:44.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79531.log.gz
2026-03-25T15:45:44.752 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79574.log
2026-03-25T15:45:44.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79553.log.gz
2026-03-25T15:45:44.752 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79596.log
2026-03-25T15:45:44.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79574.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79574.log.gz
2026-03-25T15:45:44.753 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79617.log
2026-03-25T15:45:44.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79596.log.gz
2026-03-25T15:45:44.753 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79639.log
2026-03-25T15:45:44.754 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79617.log.gz
2026-03-25T15:45:44.754 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79660.log
2026-03-25T15:45:44.754 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79639.log.gz
2026-03-25T15:45:44.755 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79682.log
2026-03-25T15:45:44.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79660.log.gz
2026-03-25T15:45:44.755 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79703.log
2026-03-25T15:45:44.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79682.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79682.log.gz
2026-03-25T15:45:44.756 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79725.log
2026-03-25T15:45:44.756 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79703.log.gz
2026-03-25T15:45:44.756 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.79746.log
2026-03-25T15:45:44.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79725.log.gz
2026-03-25T15:45:44.757 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79768.log
2026-03-25T15:45:44.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79746.log.gz
2026-03-25T15:45:44.757 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79789.log
2026-03-25T15:45:44.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79768.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79768.log.gz
2026-03-25T15:45:44.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79811.log
2026-03-25T15:45:44.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79789.log.gz
2026-03-25T15:45:44.759 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79832.log
2026-03-25T15:45:44.759 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79811.log.gz
2026-03-25T15:45:44.759 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79854.log
2026-03-25T15:45:44.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79832.log.gz
2026-03-25T15:45:44.760 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79875.log
2026-03-25T15:45:44.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79854.log.gz
2026-03-25T15:45:44.761 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79897.log
2026-03-25T15:45:44.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79875.log.gz
2026-03-25T15:45:44.761 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79918.log
2026-03-25T15:45:44.762 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79897.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79897.log.gz
2026-03-25T15:45:44.762 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79940.log
2026-03-25T15:45:44.762 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79918.log.gz
2026-03-25T15:45:44.762 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79961.log
2026-03-25T15:45:44.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79940.log.gz
2026-03-25T15:45:44.763 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79983.log
2026-03-25T15:45:44.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79961.log.gz
2026-03-25T15:45:44.763 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80004.log
2026-03-25T15:45:44.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.79983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79983.log.gz
2026-03-25T15:45:44.764 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.80025.log 2026-03-25T15:45:44.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80004.log.gz 2026-03-25T15:45:44.765 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80050.log 2026-03-25T15:45:44.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80025.log.gz 2026-03-25T15:45:44.765 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80071.log 2026-03-25T15:45:44.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80050.log.gz 2026-03-25T15:45:44.766 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80093.log 2026-03-25T15:45:44.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80071.log.gz 2026-03-25T15:45:44.766 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80114.log 2026-03-25T15:45:44.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80093.log.gz 2026-03-25T15:45:44.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80136.log 2026-03-25T15:45:44.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80114.log.gz 2026-03-25T15:45:44.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80157.log 2026-03-25T15:45:44.768 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80136.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.80136.log.gz 2026-03-25T15:45:44.768 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80179.log 2026-03-25T15:45:44.768 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80157.log.gz 2026-03-25T15:45:44.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80200.log 2026-03-25T15:45:44.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80179.log.gz 2026-03-25T15:45:44.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80222.log 2026-03-25T15:45:44.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80200.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80200.log.gz 2026-03-25T15:45:44.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80243.log 2026-03-25T15:45:44.770 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80222.log.gz 2026-03-25T15:45:44.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80265.log 2026-03-25T15:45:44.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80243.log.gz 2026-03-25T15:45:44.771 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80286.log 2026-03-25T15:45:44.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80265.log.gz 2026-03-25T15:45:44.771 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.80308.log 2026-03-25T15:45:44.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80286.log.gz 2026-03-25T15:45:44.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80329.log 2026-03-25T15:45:44.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80308.log.gz 2026-03-25T15:45:44.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80351.log 2026-03-25T15:45:44.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80329.log.gz 2026-03-25T15:45:44.773 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80372.log 2026-03-25T15:45:44.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80351.log.gz 2026-03-25T15:45:44.774 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80394.log 2026-03-25T15:45:44.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80372.log.gz 2026-03-25T15:45:44.774 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80415.log 2026-03-25T15:45:44.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80394.log.gz 2026-03-25T15:45:44.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80437.log 2026-03-25T15:45:44.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80415.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.80415.log.gz 2026-03-25T15:45:44.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80458.log 2026-03-25T15:45:44.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80437.log.gz 2026-03-25T15:45:44.776 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80480.log 2026-03-25T15:45:44.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80458.log.gz 2026-03-25T15:45:44.776 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80501.log 2026-03-25T15:45:44.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80480.log.gz 2026-03-25T15:45:44.777 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80523.log 2026-03-25T15:45:44.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80501.log.gz 2026-03-25T15:45:44.777 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80544.log 2026-03-25T15:45:44.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80523.log.gz 2026-03-25T15:45:44.778 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80566.log 2026-03-25T15:45:44.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80544.log.gz 2026-03-25T15:45:44.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.80587.log 2026-03-25T15:45:44.779 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80566.log.gz 2026-03-25T15:45:44.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80609.log 2026-03-25T15:45:44.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80587.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80587.log.gz 2026-03-25T15:45:44.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80630.log 2026-03-25T15:45:44.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80609.log.gz 2026-03-25T15:45:44.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80652.log 2026-03-25T15:45:44.781 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80630.log.gz 2026-03-25T15:45:44.781 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80673.log 2026-03-25T15:45:44.781 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80652.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80652.log.gz 2026-03-25T15:45:44.782 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80695.log 2026-03-25T15:45:44.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80673.log.gz 2026-03-25T15:45:44.782 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80716.log 2026-03-25T15:45:44.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80695.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.80695.log.gz 2026-03-25T15:45:44.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80738.log 2026-03-25T15:45:44.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80716.log.gz 2026-03-25T15:45:44.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80759.log 2026-03-25T15:45:44.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80738.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80738.log.gz 2026-03-25T15:45:44.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80781.log 2026-03-25T15:45:44.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80759.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80759.log.gz 2026-03-25T15:45:44.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80802.log 2026-03-25T15:45:44.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80781.log.gz 2026-03-25T15:45:44.785 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80823.log 2026-03-25T15:45:44.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80802.log.gz 2026-03-25T15:45:44.785 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80825.log 2026-03-25T15:45:44.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80823.log.gz 2026-03-25T15:45:44.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.80827.log 2026-03-25T15:45:44.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80825.log.gz 2026-03-25T15:45:44.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80829.log 2026-03-25T15:45:44.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80827.log.gz 2026-03-25T15:45:44.787 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80846.log 2026-03-25T15:45:44.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80829.log.gz 2026-03-25T15:45:44.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80863.log 2026-03-25T15:45:44.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80846.log.gz 2026-03-25T15:45:44.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80880.log 2026-03-25T15:45:44.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80863.log.gz 2026-03-25T15:45:44.789 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80897.log 2026-03-25T15:45:44.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80880.log.gz 2026-03-25T15:45:44.789 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80915.log 2026-03-25T15:45:44.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80897.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.80897.log.gz 2026-03-25T15:45:44.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80933.log 2026-03-25T15:45:44.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80915.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80915.log.gz 2026-03-25T15:45:44.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80951.log 2026-03-25T15:45:44.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80933.log.gz 2026-03-25T15:45:44.791 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80968.log 2026-03-25T15:45:44.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80951.log.gz 2026-03-25T15:45:44.792 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80986.log 2026-03-25T15:45:44.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80968.log.gz 2026-03-25T15:45:44.792 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81004.log 2026-03-25T15:45:44.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.80986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80986.log.gz 2026-03-25T15:45:44.793 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81022.log 2026-03-25T15:45:44.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81004.log.gz 2026-03-25T15:45:44.793 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.81040.log 2026-03-25T15:45:44.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81022.log.gz 2026-03-25T15:45:44.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81057.log 2026-03-25T15:45:44.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81040.log.gz 2026-03-25T15:45:44.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81074.log 2026-03-25T15:45:44.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81057.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81057.log.gz 2026-03-25T15:45:44.795 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81092.log 2026-03-25T15:45:44.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81074.log.gz 2026-03-25T15:45:44.795 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81109.log 2026-03-25T15:45:44.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81092.log.gz 2026-03-25T15:45:44.796 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81126.log 2026-03-25T15:45:44.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81109.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81109.log.gz 2026-03-25T15:45:44.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81143.log 2026-03-25T15:45:44.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81126.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.81126.log.gz 2026-03-25T15:45:44.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81161.log 2026-03-25T15:45:44.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81143.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81143.log.gz 2026-03-25T15:45:44.798 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81179.log 2026-03-25T15:45:44.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81161.log.gz 2026-03-25T15:45:44.798 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81196.log 2026-03-25T15:45:44.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81179.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.81179.log.gz 2026-03-25T15:45:44.799 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81217.log 2026-03-25T15:45:44.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81196.log.gz 2026-03-25T15:45:44.799 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81237.log 2026-03-25T15:45:44.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81217.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81217.log.gz 2026-03-25T15:45:44.800 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81258.log 2026-03-25T15:45:44.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81237.log.gz 2026-03-25T15:45:44.801 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.81279.log 2026-03-25T15:45:44.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81258.log.gz 2026-03-25T15:45:44.801 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81299.log 2026-03-25T15:45:44.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81279.log.gz 2026-03-25T15:45:44.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81319.log 2026-03-25T15:45:44.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81299.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81299.log.gz 2026-03-25T15:45:44.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81340.log 2026-03-25T15:45:44.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81319.log.gz 2026-03-25T15:45:44.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81357.log 2026-03-25T15:45:44.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81340.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81340.log.gz 2026-03-25T15:45:44.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81374.log 2026-03-25T15:45:44.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81357.log.gz 2026-03-25T15:45:44.804 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81392.log 2026-03-25T15:45:44.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81374.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.81374.log.gz 2026-03-25T15:45:44.804 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81413.log 2026-03-25T15:45:44.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81392.log.gz 2026-03-25T15:45:44.805 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81434.log 2026-03-25T15:45:44.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81413.log.gz 2026-03-25T15:45:44.806 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81456.log 2026-03-25T15:45:44.806 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81434.log.gz 2026-03-25T15:45:44.806 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81477.log 2026-03-25T15:45:44.806 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81456.log.gz 2026-03-25T15:45:44.807 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81499.log 2026-03-25T15:45:44.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81477.log.gz 2026-03-25T15:45:44.807 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81520.log 2026-03-25T15:45:44.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81499.log.gz 2026-03-25T15:45:44.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.81542.log 2026-03-25T15:45:44.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81520.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81520.log.gz 2026-03-25T15:45:44.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81563.log 2026-03-25T15:45:44.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81542.log.gz 2026-03-25T15:45:44.809 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81585.log 2026-03-25T15:45:44.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81563.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81563.log.gz 2026-03-25T15:45:44.809 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81606.log 2026-03-25T15:45:44.810 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81585.log.gz 2026-03-25T15:45:44.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81628.log 2026-03-25T15:45:44.810 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81606.log.gz 2026-03-25T15:45:44.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81649.log 2026-03-25T15:45:44.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81628.log.gz 2026-03-25T15:45:44.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81671.log 2026-03-25T15:45:44.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81649.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.81649.log.gz 2026-03-25T15:45:44.812 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81692.log 2026-03-25T15:45:44.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81671.log.gz 2026-03-25T15:45:44.812 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81714.log 2026-03-25T15:45:44.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81692.log.gz 2026-03-25T15:45:44.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81735.log 2026-03-25T15:45:44.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81714.log.gz 2026-03-25T15:45:44.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81757.log 2026-03-25T15:45:44.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81735.log.gz 2026-03-25T15:45:44.814 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81778.log 2026-03-25T15:45:44.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81757.log.gz 2026-03-25T15:45:44.814 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81800.log 2026-03-25T15:45:44.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81778.log.gz 2026-03-25T15:45:44.815 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.81821.log 2026-03-25T15:45:44.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81800.log.gz 2026-03-25T15:45:44.815 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81843.log 2026-03-25T15:45:44.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81821.log.gz 2026-03-25T15:45:44.816 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81864.log 2026-03-25T15:45:44.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81843.log.gz 2026-03-25T15:45:44.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81886.log 2026-03-25T15:45:44.817 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81864.log.gz 2026-03-25T15:45:44.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81907.log 2026-03-25T15:45:44.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81886.log.gz 2026-03-25T15:45:44.818 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81929.log 2026-03-25T15:45:44.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81907.log.gz 2026-03-25T15:45:44.819 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81950.log 2026-03-25T15:45:44.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81929.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.81929.log.gz 2026-03-25T15:45:44.819 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81972.log 2026-03-25T15:45:44.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81950.log.gz 2026-03-25T15:45:44.820 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81993.log 2026-03-25T15:45:44.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81972.log.gz 2026-03-25T15:45:44.821 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82015.log 2026-03-25T15:45:44.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.81993.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81993.log.gz 2026-03-25T15:45:44.821 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82036.log 2026-03-25T15:45:44.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82015.log.gz 2026-03-25T15:45:44.822 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82058.log 2026-03-25T15:45:44.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82036.log.gz 2026-03-25T15:45:44.823 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82079.log 2026-03-25T15:45:44.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82058.log.gz 2026-03-25T15:45:44.824 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82101.log 2026-03-25T15:45:44.824 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82079.log.gz 2026-03-25T15:45:44.824 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82122.log 2026-03-25T15:45:44.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82101.log.gz 2026-03-25T15:45:44.825 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82144.log 2026-03-25T15:45:44.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82122.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82122.log.gz 2026-03-25T15:45:44.826 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82165.log 2026-03-25T15:45:44.826 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82144.log.gz 2026-03-25T15:45:44.827 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82192.log 2026-03-25T15:45:44.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82165.log.gz 2026-03-25T15:45:44.827 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82212.log 2026-03-25T15:45:44.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82192.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.82192.log.gz 2026-03-25T15:45:44.828 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82232.log 2026-03-25T15:45:44.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82212.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.82212.log.gz 2026-03-25T15:45:44.829 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82252.log 2026-03-25T15:45:44.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82232.log.gz 2026-03-25T15:45:44.830 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82272.log 2026-03-25T15:45:44.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82252.log.gz 2026-03-25T15:45:44.831 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82292.log 2026-03-25T15:45:44.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82272.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82272.log.gz 2026-03-25T15:45:44.832 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82312.log 2026-03-25T15:45:44.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82292.log.gz 2026-03-25T15:45:44.833 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82332.log 2026-03-25T15:45:44.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82312.log.gz 2026-03-25T15:45:44.833 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82353.log 2026-03-25T15:45:44.834 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82332.log.gz 2026-03-25T15:45:44.834 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82375.log 2026-03-25T15:45:44.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82353.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82353.log.gz 2026-03-25T15:45:44.835 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82395.log 2026-03-25T15:45:44.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82375.log.gz 2026-03-25T15:45:44.836 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82415.log 2026-03-25T15:45:44.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82395.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82395.log.gz 2026-03-25T15:45:44.837 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82432.log 2026-03-25T15:45:44.837 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82415.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.82415.log.gz 2026-03-25T15:45:44.837 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82452.log 2026-03-25T15:45:44.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82432.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82432.log.gz 2026-03-25T15:45:44.838 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82472.log 2026-03-25T15:45:44.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82452.log.gz 2026-03-25T15:45:44.839 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82493.log 2026-03-25T15:45:44.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82472.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.82472.log.gz 2026-03-25T15:45:44.839 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82514.log 2026-03-25T15:45:44.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82493.log.gz 2026-03-25T15:45:44.840 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82535.log 2026-03-25T15:45:44.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82514.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82514.log.gz 2026-03-25T15:45:44.841 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82556.log 2026-03-25T15:45:44.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82535.log.gz 2026-03-25T15:45:44.841 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82576.log 2026-03-25T15:45:44.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82556.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82556.log.gz 2026-03-25T15:45:44.842 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82596.log 2026-03-25T15:45:44.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82576.log.gz 2026-03-25T15:45:44.842 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82616.log 2026-03-25T15:45:44.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82596.log.gz 2026-03-25T15:45:44.843 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82636.log 2026-03-25T15:45:44.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82616.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82616.log.gz 2026-03-25T15:45:44.844 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82657.log 2026-03-25T15:45:44.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82636.log.gz 2026-03-25T15:45:44.844 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82678.log 2026-03-25T15:45:44.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82657.log.gz 2026-03-25T15:45:44.845 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82699.log 2026-03-25T15:45:44.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82678.log.gz 2026-03-25T15:45:44.845 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82720.log 2026-03-25T15:45:44.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82699.log.gz 2026-03-25T15:45:44.846 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82740.log 2026-03-25T15:45:44.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82720.log.gz 2026-03-25T15:45:44.847 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82760.log 2026-03-25T15:45:44.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82740.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.82740.log.gz 2026-03-25T15:45:44.847 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82781.log 2026-03-25T15:45:44.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82760.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82760.log.gz 2026-03-25T15:45:44.848 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82802.log 2026-03-25T15:45:44.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82781.log.gz 2026-03-25T15:45:44.848 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82825.log 2026-03-25T15:45:44.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82802.log.gz 2026-03-25T15:45:44.849 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82846.log 2026-03-25T15:45:44.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82825.log.gz 2026-03-25T15:45:44.849 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82866.log 2026-03-25T15:45:44.850 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82846.log.gz 2026-03-25T15:45:44.850 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82886.log 2026-03-25T15:45:44.850 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82866.log.gz 2026-03-25T15:45:44.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82906.log 2026-03-25T15:45:44.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82886.log.gz 2026-03-25T15:45:44.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82926.log 2026-03-25T15:45:44.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82906.log.gz 2026-03-25T15:45:44.852 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82946.log 2026-03-25T15:45:44.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82926.log.gz 2026-03-25T15:45:44.852 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82970.log 2026-03-25T15:45:44.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82946.log.gz 2026-03-25T15:45:44.853 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82990.log 2026-03-25T15:45:44.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82970.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82970.log.gz 2026-03-25T15:45:44.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83010.log 2026-03-25T15:45:44.854 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.82990.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82990.log.gz 2026-03-25T15:45:44.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83031.log 2026-03-25T15:45:44.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83010.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.83010.log.gz 2026-03-25T15:45:44.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83052.log 2026-03-25T15:45:44.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83031.log.gz 2026-03-25T15:45:44.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83072.log 2026-03-25T15:45:44.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83052.log.gz 2026-03-25T15:45:44.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83094.log 2026-03-25T15:45:44.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83072.log.gz 2026-03-25T15:45:44.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83115.log 2026-03-25T15:45:44.857 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83094.log: 56.1% -- replaced with /var/log/ceph/ceph-client.admin.83094.log.gz 2026-03-25T15:45:44.857 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83136.log 2026-03-25T15:45:44.857 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83115.log.gz 2026-03-25T15:45:44.858 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83153.log 2026-03-25T15:45:44.858 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83136.log: 3.8% -- replaced with /var/log/ceph/ceph-client.admin.83136.log.gz 2026-03-25T15:45:44.858 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.83173.log 2026-03-25T15:45:44.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83153.log.gz 2026-03-25T15:45:44.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83193.log 2026-03-25T15:45:44.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83173.log.gz 2026-03-25T15:45:44.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83213.log 2026-03-25T15:45:44.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83193.log.gz 2026-03-25T15:45:44.860 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83233.log 2026-03-25T15:45:44.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83213.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83213.log.gz 2026-03-25T15:45:44.861 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83254.log 2026-03-25T15:45:44.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83233.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83233.log.gz 2026-03-25T15:45:44.861 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83275.log 2026-03-25T15:45:44.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83254.log.gz 2026-03-25T15:45:44.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83296.log 2026-03-25T15:45:44.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83275.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.83275.log.gz 2026-03-25T15:45:44.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83317.log 2026-03-25T15:45:44.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83296.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83296.log.gz 2026-03-25T15:45:44.863 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83338.log 2026-03-25T15:45:44.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83317.log.gz 2026-03-25T15:45:44.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83359.log 2026-03-25T15:45:44.864 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83338.log.gz 2026-03-25T15:45:44.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83380.log 2026-03-25T15:45:44.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83359.log.gz 2026-03-25T15:45:44.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83401.log 2026-03-25T15:45:44.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83380.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83380.log.gz 2026-03-25T15:45:44.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83423.log 2026-03-25T15:45:44.866 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83401.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83401.log.gz 2026-03-25T15:45:44.866 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.83444.log 2026-03-25T15:45:44.866 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83423.log.gz 2026-03-25T15:45:44.867 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83466.log 2026-03-25T15:45:44.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83444.log.gz 2026-03-25T15:45:44.867 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83487.log 2026-03-25T15:45:44.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83466.log.gz 2026-03-25T15:45:44.868 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83509.log 2026-03-25T15:45:44.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83487.log: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.83487.log.gz 2026-03-25T15:45:44.868 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83530.log 2026-03-25T15:45:44.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83509.log.gz 2026-03-25T15:45:44.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83552.log 2026-03-25T15:45:44.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83530.log.gz 2026-03-25T15:45:44.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83573.log 2026-03-25T15:45:44.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83552.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.83552.log.gz 2026-03-25T15:45:44.870 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83594.log 2026-03-25T15:45:44.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83573.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83573.log.gz 2026-03-25T15:45:44.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83615.log 2026-03-25T15:45:44.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83594.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83594.log.gz 2026-03-25T15:45:44.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83637.log 2026-03-25T15:45:44.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83615.log: 56.1% -- replaced with /var/log/ceph/ceph-client.admin.83615.log.gz 2026-03-25T15:45:44.872 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83658.log 2026-03-25T15:45:44.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83637.log.gz 2026-03-25T15:45:44.872 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83680.log 2026-03-25T15:45:44.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83658.log.gz 2026-03-25T15:45:44.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83701.log 2026-03-25T15:45:44.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83680.log.gz 2026-03-25T15:45:44.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.83723.log 2026-03-25T15:45:44.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83701.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83701.log.gz 2026-03-25T15:45:44.874 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83744.log 2026-03-25T15:45:44.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83723.log.gz 2026-03-25T15:45:44.875 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83766.log 2026-03-25T15:45:44.875 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83744.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83744.log.gz 2026-03-25T15:45:44.875 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83787.log 2026-03-25T15:45:44.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83766.log.gz 2026-03-25T15:45:44.876 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83809.log 2026-03-25T15:45:44.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83787.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83787.log.gz 2026-03-25T15:45:44.876 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83830.log 2026-03-25T15:45:44.877 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83809.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83809.log.gz 2026-03-25T15:45:44.877 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83852.log 2026-03-25T15:45:44.877 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83830.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.83830.log.gz 2026-03-25T15:45:44.878 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83873.log 2026-03-25T15:45:44.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83852.log.gz 2026-03-25T15:45:44.878 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83895.log 2026-03-25T15:45:44.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83873.log.gz 2026-03-25T15:45:44.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83916.log 2026-03-25T15:45:44.879 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83895.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83895.log.gz 2026-03-25T15:45:44.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83938.log 2026-03-25T15:45:44.880 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83916.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83916.log.gz 2026-03-25T15:45:44.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83959.log 2026-03-25T15:45:44.880 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83938.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83938.log.gz 2026-03-25T15:45:44.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83981.log 2026-03-25T15:45:44.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83959.log.gz 2026-03-25T15:45:44.881 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84002.log 2026-03-25T15:45:44.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.83981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83981.log.gz 2026-03-25T15:45:44.881 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84024.log 2026-03-25T15:45:44.882 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84002.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84002.log.gz 2026-03-25T15:45:44.882 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84045.log 2026-03-25T15:45:44.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84024.log.gz 2026-03-25T15:45:44.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84068.log 2026-03-25T15:45:44.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84045.log.gz 2026-03-25T15:45:44.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84089.log 2026-03-25T15:45:44.884 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84068.log.gz 2026-03-25T15:45:44.884 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84111.log 2026-03-25T15:45:44.884 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84089.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84089.log.gz 2026-03-25T15:45:44.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84132.log 2026-03-25T15:45:44.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84111.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.84111.log.gz 2026-03-25T15:45:44.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84154.log 2026-03-25T15:45:44.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84132.log.gz 2026-03-25T15:45:44.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84175.log 2026-03-25T15:45:44.886 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84154.log.gz 2026-03-25T15:45:44.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84197.log 2026-03-25T15:45:44.887 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84175.log.gz 2026-03-25T15:45:44.887 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84218.log 2026-03-25T15:45:44.887 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84197.log.gz 2026-03-25T15:45:44.887 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84240.log 2026-03-25T15:45:44.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84218.log.gz 2026-03-25T15:45:44.888 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84261.log 2026-03-25T15:45:44.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84240.log.gz 2026-03-25T15:45:44.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84283.log 2026-03-25T15:45:44.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84261.log.gz 2026-03-25T15:45:44.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84304.log 2026-03-25T15:45:44.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84283.log.gz 2026-03-25T15:45:44.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84326.log 2026-03-25T15:45:44.890 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84304.log.gz 2026-03-25T15:45:44.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84347.log 2026-03-25T15:45:44.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84326.log.gz 2026-03-25T15:45:44.891 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84369.log 2026-03-25T15:45:44.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84347.log.gz 2026-03-25T15:45:44.891 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84390.log 2026-03-25T15:45:44.892 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84369.log.gz 2026-03-25T15:45:44.892 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84412.log 2026-03-25T15:45:44.892 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84390.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.84390.log.gz 2026-03-25T15:45:44.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84433.log 2026-03-25T15:45:44.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84412.log.gz 2026-03-25T15:45:44.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84455.log 2026-03-25T15:45:44.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84433.log.gz 2026-03-25T15:45:44.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84476.log 2026-03-25T15:45:44.894 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84455.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84455.log.gz 2026-03-25T15:45:44.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84498.log 2026-03-25T15:45:44.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84476.log.gz 2026-03-25T15:45:44.895 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84519.log 2026-03-25T15:45:44.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84498.log.gz 2026-03-25T15:45:44.895 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84541.log 2026-03-25T15:45:44.896 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84519.log.gz 2026-03-25T15:45:44.896 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84562.log 2026-03-25T15:45:44.896 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84541.log.gz 2026-03-25T15:45:44.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84584.log 2026-03-25T15:45:44.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84562.log.gz 2026-03-25T15:45:44.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84605.log 2026-03-25T15:45:44.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84584.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84584.log.gz 2026-03-25T15:45:44.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84627.log 2026-03-25T15:45:44.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84605.log.gz 2026-03-25T15:45:44.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84648.log 2026-03-25T15:45:44.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84627.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84627.log.gz 2026-03-25T15:45:44.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84670.log 2026-03-25T15:45:44.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84648.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84648.log.gz 2026-03-25T15:45:44.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84691.log 2026-03-25T15:45:44.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84670.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.84670.log.gz 2026-03-25T15:45:44.900 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84713.log 2026-03-25T15:45:44.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84691.log.gz 2026-03-25T15:45:44.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84734.log 2026-03-25T15:45:44.901 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84713.log.gz 2026-03-25T15:45:44.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84756.log 2026-03-25T15:45:44.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84734.log.gz 2026-03-25T15:45:44.902 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84777.log 2026-03-25T15:45:44.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84756.log.gz 2026-03-25T15:45:44.902 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84799.log 2026-03-25T15:45:44.903 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84777.log.gz 2026-03-25T15:45:44.903 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84820.log 2026-03-25T15:45:44.903 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84799.log.gz 2026-03-25T15:45:44.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84841.log 2026-03-25T15:45:44.904 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84820.log.gz 2026-03-25T15:45:44.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84858.log 2026-03-25T15:45:44.905 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84841.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.84841.log.gz 2026-03-25T15:45:44.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84880.log 2026-03-25T15:45:44.905 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84858.log.gz 2026-03-25T15:45:44.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84900.log 2026-03-25T15:45:44.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84880.log.gz 2026-03-25T15:45:44.906 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84922.log 2026-03-25T15:45:44.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84900.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84900.log.gz 2026-03-25T15:45:44.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84944.log 2026-03-25T15:45:44.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84922.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84922.log.gz 2026-03-25T15:45:44.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84964.log 2026-03-25T15:45:44.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84944.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.84944.log.gz 2026-03-25T15:45:44.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84986.log 2026-03-25T15:45:44.908 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84964.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84964.log.gz 2026-03-25T15:45:44.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85006.log 2026-03-25T15:45:44.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.84986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84986.log.gz 2026-03-25T15:45:44.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85027.log 2026-03-25T15:45:44.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85006.log.gz 2026-03-25T15:45:44.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85047.log 2026-03-25T15:45:44.910 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85027.log.gz 2026-03-25T15:45:44.910 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85068.log 2026-03-25T15:45:44.910 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85047.log.gz 2026-03-25T15:45:44.911 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85088.log 2026-03-25T15:45:44.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85068.log.gz 2026-03-25T15:45:44.911 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85109.log 2026-03-25T15:45:44.912 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85088.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85088.log.gz 2026-03-25T15:45:44.912 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85132.log 2026-03-25T15:45:44.912 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85109.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85109.log.gz 2026-03-25T15:45:44.912 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85152.log 2026-03-25T15:45:44.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85132.log.gz 2026-03-25T15:45:44.913 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85173.log 2026-03-25T15:45:44.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85152.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85152.log.gz 2026-03-25T15:45:44.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85194.log 2026-03-25T15:45:44.914 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85173.log.gz 2026-03-25T15:45:44.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85215.log 2026-03-25T15:45:44.914 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85194.log.gz 2026-03-25T15:45:44.915 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85237.log 2026-03-25T15:45:44.915 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85215.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.85215.log.gz 2026-03-25T15:45:44.915 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85258.log 2026-03-25T15:45:44.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85237.log.gz 2026-03-25T15:45:44.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85280.log 2026-03-25T15:45:44.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85258.log.gz 2026-03-25T15:45:44.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85301.log 2026-03-25T15:45:44.917 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85280.log.gz 2026-03-25T15:45:44.917 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85323.log 2026-03-25T15:45:44.917 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85301.log.gz 2026-03-25T15:45:44.918 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85344.log 2026-03-25T15:45:44.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85323.log.gz 2026-03-25T15:45:44.918 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85366.log 2026-03-25T15:45:44.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85344.log.gz 2026-03-25T15:45:44.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85387.log 2026-03-25T15:45:44.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85366.log.gz 2026-03-25T15:45:44.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85409.log 2026-03-25T15:45:44.920 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85387.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85387.log.gz 2026-03-25T15:45:44.920 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85430.log 2026-03-25T15:45:44.920 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85409.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85409.log.gz 2026-03-25T15:45:44.921 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85452.log 2026-03-25T15:45:44.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85430.log.gz 2026-03-25T15:45:44.921 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85473.log 2026-03-25T15:45:44.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85452.log.gz 2026-03-25T15:45:44.922 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85495.log 2026-03-25T15:45:44.922 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85473.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85473.log.gz 2026-03-25T15:45:44.922 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85516.log 2026-03-25T15:45:44.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85495.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.85495.log.gz 2026-03-25T15:45:44.923 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85538.log 2026-03-25T15:45:44.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85516.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85516.log.gz 2026-03-25T15:45:44.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85559.log 2026-03-25T15:45:44.924 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85538.log.gz 2026-03-25T15:45:44.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85581.log 2026-03-25T15:45:44.924 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85559.log.gz 2026-03-25T15:45:44.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85602.log 2026-03-25T15:45:44.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85581.log.gz 2026-03-25T15:45:44.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85624.log 2026-03-25T15:45:44.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85602.log.gz 2026-03-25T15:45:44.926 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85645.log 2026-03-25T15:45:44.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85624.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85624.log.gz 2026-03-25T15:45:44.926 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85667.log 2026-03-25T15:45:44.927 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85645.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85645.log.gz 2026-03-25T15:45:44.927 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85688.log 2026-03-25T15:45:44.927 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85667.log.gz 2026-03-25T15:45:44.928 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85710.log 2026-03-25T15:45:44.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85688.log.gz 2026-03-25T15:45:44.928 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85731.log 2026-03-25T15:45:44.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85710.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85710.log.gz 2026-03-25T15:45:44.929 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85753.log 2026-03-25T15:45:44.929 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85731.log.gz 2026-03-25T15:45:44.929 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85774.log 2026-03-25T15:45:44.930 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85753.log.gz 2026-03-25T15:45:44.930 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85796.log 2026-03-25T15:45:44.930 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85774.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.85774.log.gz 2026-03-25T15:45:44.930 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85817.log 2026-03-25T15:45:44.931 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85796.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85796.log.gz 2026-03-25T15:45:44.931 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85839.log 2026-03-25T15:45:44.931 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85817.log.gz 2026-03-25T15:45:44.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85860.log 2026-03-25T15:45:44.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85839.log.gz 2026-03-25T15:45:44.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85882.log 2026-03-25T15:45:44.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85860.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85860.log.gz 2026-03-25T15:45:44.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85903.log 2026-03-25T15:45:44.933 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85882.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85882.log.gz 2026-03-25T15:45:44.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85925.log 2026-03-25T15:45:44.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85903.log.gz 2026-03-25T15:45:44.934 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85946.log 2026-03-25T15:45:44.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85925.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85925.log.gz 2026-03-25T15:45:44.934 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85967.log 2026-03-25T15:45:44.935 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85946.log.gz 2026-03-25T15:45:44.935 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85990.log 2026-03-25T15:45:44.935 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85967.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.85967.log.gz 2026-03-25T15:45:44.935 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86014.log 2026-03-25T15:45:44.936 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.85990.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85990.log.gz 2026-03-25T15:45:44.936 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86034.log 2026-03-25T15:45:44.936 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86014.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86014.log.gz 2026-03-25T15:45:44.937 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86054.log 2026-03-25T15:45:44.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86034.log.gz 2026-03-25T15:45:44.937 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86074.log 2026-03-25T15:45:44.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86054.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.86054.log.gz 2026-03-25T15:45:44.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86135.log 2026-03-25T15:45:44.938 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86074.log: 57.7% -- replaced with /var/log/ceph/ceph-client.admin.86074.log.gz 2026-03-25T15:45:44.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86159.log 2026-03-25T15:45:44.939 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86135.log: 53.8% -- replaced with /var/log/ceph/ceph-client.admin.86135.log.gz 2026-03-25T15:45:44.939 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86179.log 2026-03-25T15:45:44.939 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86159.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.86159.log.gz 2026-03-25T15:45:44.939 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86199.log 2026-03-25T15:45:44.940 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86179.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.86179.log.gz 2026-03-25T15:45:44.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86220.log 2026-03-25T15:45:44.940 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86199.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86199.log.gz 2026-03-25T15:45:44.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86241.log 2026-03-25T15:45:44.941 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86220.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86220.log.gz 2026-03-25T15:45:44.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.86263.log 2026-03-25T15:45:44.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86241.log.gz 2026-03-25T15:45:44.942 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86284.log 2026-03-25T15:45:44.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86263.log.gz 2026-03-25T15:45:44.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86306.log 2026-03-25T15:45:44.943 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86284.log.gz 2026-03-25T15:45:44.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86327.log 2026-03-25T15:45:44.943 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86306.log.gz 2026-03-25T15:45:44.944 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86349.log 2026-03-25T15:45:44.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86327.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86327.log.gz 2026-03-25T15:45:44.944 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86370.log 2026-03-25T15:45:44.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86349.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86349.log.gz 2026-03-25T15:45:44.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86392.log 2026-03-25T15:45:44.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86370.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.86370.log.gz 2026-03-25T15:45:44.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86413.log 2026-03-25T15:45:44.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86392.log.gz 2026-03-25T15:45:44.946 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86435.log 2026-03-25T15:45:44.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86413.log.gz 2026-03-25T15:45:44.946 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86456.log 2026-03-25T15:45:44.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86435.log.gz 2026-03-25T15:45:44.947 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86478.log 2026-03-25T15:45:44.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86456.log.gz 2026-03-25T15:45:44.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86499.log 2026-03-25T15:45:44.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86478.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86478.log.gz 2026-03-25T15:45:44.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86521.log 2026-03-25T15:45:44.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86499.log.gz 2026-03-25T15:45:44.949 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.86542.log 2026-03-25T15:45:44.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86521.log.gz 2026-03-25T15:45:44.949 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86564.log 2026-03-25T15:45:44.950 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86542.log.gz 2026-03-25T15:45:44.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86585.log 2026-03-25T15:45:44.950 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86564.log.gz 2026-03-25T15:45:44.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86607.log 2026-03-25T15:45:44.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86585.log.gz 2026-03-25T15:45:44.951 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86628.log 2026-03-25T15:45:44.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86607.log.gz 2026-03-25T15:45:44.951 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86650.log 2026-03-25T15:45:44.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86628.log.gz 2026-03-25T15:45:44.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86671.log 2026-03-25T15:45:44.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86650.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.86650.log.gz 2026-03-25T15:45:44.953 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86693.log 2026-03-25T15:45:44.953 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86671.log.gz 2026-03-25T15:45:44.953 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86714.log 2026-03-25T15:45:44.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86693.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86693.log.gz 2026-03-25T15:45:44.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86736.log 2026-03-25T15:45:44.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86714.log.gz 2026-03-25T15:45:44.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86757.log 2026-03-25T15:45:44.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86736.log.gz 2026-03-25T15:45:44.955 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86779.log 2026-03-25T15:45:44.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86757.log.gz 2026-03-25T15:45:44.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86800.log 2026-03-25T15:45:44.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86779.log.gz 2026-03-25T15:45:44.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.86822.log 2026-03-25T15:45:44.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86800.log.gz 2026-03-25T15:45:44.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86843.log 2026-03-25T15:45:44.957 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86822.log.gz 2026-03-25T15:45:44.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86865.log 2026-03-25T15:45:44.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86843.log.gz 2026-03-25T15:45:44.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86886.log 2026-03-25T15:45:44.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86865.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86865.log.gz 2026-03-25T15:45:44.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86908.log 2026-03-25T15:45:44.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86886.log.gz 2026-03-25T15:45:44.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86929.log 2026-03-25T15:45:44.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86908.log.gz 2026-03-25T15:45:44.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86951.log 2026-03-25T15:45:44.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86929.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.86929.log.gz 2026-03-25T15:45:44.960 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86972.log 2026-03-25T15:45:44.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86951.log.gz 2026-03-25T15:45:44.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86999.log 2026-03-25T15:45:44.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86972.log.gz 2026-03-25T15:45:44.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87019.log 2026-03-25T15:45:44.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.86999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86999.log.gz 2026-03-25T15:45:44.962 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87039.log 2026-03-25T15:45:44.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87019.log.gz 2026-03-25T15:45:44.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87059.log 2026-03-25T15:45:44.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87039.log.gz 2026-03-25T15:45:44.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87079.log 2026-03-25T15:45:44.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87059.log.gz 2026-03-25T15:45:44.964 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.87099.log 2026-03-25T15:45:44.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87079.log.gz 2026-03-25T15:45:44.964 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87119.log 2026-03-25T15:45:44.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87099.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87099.log.gz 2026-03-25T15:45:44.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87139.log 2026-03-25T15:45:44.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87119.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87119.log.gz 2026-03-25T15:45:44.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87160.log 2026-03-25T15:45:44.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87139.log.gz 2026-03-25T15:45:44.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87182.log 2026-03-25T15:45:44.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87160.log.gz 2026-03-25T15:45:44.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87202.log 2026-03-25T15:45:44.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87182.log.gz 2026-03-25T15:45:44.967 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87222.log 2026-03-25T15:45:44.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87202.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.87202.log.gz 2026-03-25T15:45:44.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87239.log 2026-03-25T15:45:44.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87222.log.gz 2026-03-25T15:45:44.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87259.log 2026-03-25T15:45:44.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87239.log.gz 2026-03-25T15:45:44.969 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87279.log 2026-03-25T15:45:44.969 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87259.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87259.log.gz 2026-03-25T15:45:44.969 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87300.log 2026-03-25T15:45:44.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87279.log.gz 2026-03-25T15:45:44.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87321.log 2026-03-25T15:45:44.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87300.log.gz 2026-03-25T15:45:44.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87342.log 2026-03-25T15:45:44.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87321.log.gz 2026-03-25T15:45:44.971 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.87363.log 2026-03-25T15:45:44.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87342.log.gz 2026-03-25T15:45:44.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87383.log 2026-03-25T15:45:44.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87363.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87363.log.gz 2026-03-25T15:45:44.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87403.log 2026-03-25T15:45:44.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87383.log.gz 2026-03-25T15:45:44.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87423.log 2026-03-25T15:45:44.973 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87403.log.gz 2026-03-25T15:45:44.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87443.log 2026-03-25T15:45:44.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87423.log.gz 2026-03-25T15:45:44.974 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87464.log 2026-03-25T15:45:44.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87443.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87443.log.gz 2026-03-25T15:45:44.974 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87485.log 2026-03-25T15:45:44.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87464.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.87464.log.gz 2026-03-25T15:45:44.975 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87506.log 2026-03-25T15:45:44.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87485.log.gz 2026-03-25T15:45:44.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87527.log 2026-03-25T15:45:44.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87506.log.gz 2026-03-25T15:45:44.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87547.log 2026-03-25T15:45:44.977 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87527.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87527.log.gz 2026-03-25T15:45:44.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87567.log 2026-03-25T15:45:44.977 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87547.log.gz 2026-03-25T15:45:44.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87588.log 2026-03-25T15:45:44.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87567.log.gz 2026-03-25T15:45:44.978 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87609.log 2026-03-25T15:45:44.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87588.log.gz 2026-03-25T15:45:44.979 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.87632.log 2026-03-25T15:45:44.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87609.log.gz 2026-03-25T15:45:44.979 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87653.log 2026-03-25T15:45:44.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87632.log.gz 2026-03-25T15:45:44.980 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87673.log 2026-03-25T15:45:44.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87653.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87653.log.gz 2026-03-25T15:45:44.980 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87693.log 2026-03-25T15:45:44.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87673.log.gz 2026-03-25T15:45:44.981 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87713.log 2026-03-25T15:45:44.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87693.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87693.log.gz 2026-03-25T15:45:44.982 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87733.log 2026-03-25T15:45:44.982 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87713.log.gz 2026-03-25T15:45:44.982 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87753.log 2026-03-25T15:45:44.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87733.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.87733.log.gz 2026-03-25T15:45:44.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87777.log 2026-03-25T15:45:44.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87753.log.gz 2026-03-25T15:45:44.984 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87797.log 2026-03-25T15:45:44.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87777.log.gz 2026-03-25T15:45:44.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87817.log 2026-03-25T15:45:44.985 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87797.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87797.log.gz 2026-03-25T15:45:44.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87838.log 2026-03-25T15:45:44.986 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87817.log.gz 2026-03-25T15:45:44.986 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87859.log 2026-03-25T15:45:44.986 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87838.log.gz 2026-03-25T15:45:44.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87879.log 2026-03-25T15:45:44.987 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87859.log.gz 2026-03-25T15:45:44.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.87901.log 2026-03-25T15:45:44.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87879.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87879.log.gz 2026-03-25T15:45:44.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87922.log 2026-03-25T15:45:44.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87901.log: 25.0% -- replaced with /var/log/ceph/ceph-client.admin.87901.log.gz 2026-03-25T15:45:44.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87943.log 2026-03-25T15:45:44.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87922.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.87922.log.gz 2026-03-25T15:45:44.989 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87960.log 2026-03-25T15:45:44.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87943.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.87943.log.gz 2026-03-25T15:45:44.990 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87980.log 2026-03-25T15:45:44.990 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87960.log.gz 2026-03-25T15:45:44.990 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88000.log 2026-03-25T15:45:44.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.87980.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87980.log.gz 2026-03-25T15:45:44.991 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88020.log 2026-03-25T15:45:44.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88000.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.88000.log.gz 2026-03-25T15:45:44.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88040.log 2026-03-25T15:45:44.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88020.log.gz 2026-03-25T15:45:44.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88061.log 2026-03-25T15:45:44.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88040.log.gz 2026-03-25T15:45:44.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88082.log 2026-03-25T15:45:44.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88061.log.gz 2026-03-25T15:45:44.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88103.log 2026-03-25T15:45:44.994 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88082.log.gz 2026-03-25T15:45:44.994 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88124.log 2026-03-25T15:45:44.994 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88103.log.gz 2026-03-25T15:45:44.994 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88146.log 2026-03-25T15:45:44.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88124.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.88124.log.gz 2026-03-25T15:45:44.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.88167.log 2026-03-25T15:45:44.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88146.log.gz 2026-03-25T15:45:44.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88188.log 2026-03-25T15:45:44.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88167.log.gz 2026-03-25T15:45:44.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88209.log 2026-03-25T15:45:44.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88188.log.gz 2026-03-25T15:45:44.997 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88231.log 2026-03-25T15:45:44.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88209.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88209.log.gz 2026-03-25T15:45:44.997 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88252.log 2026-03-25T15:45:44.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88231.log.gz 2026-03-25T15:45:44.998 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88274.log 2026-03-25T15:45:44.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88252.log: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.88252.log.gz 2026-03-25T15:45:44.998 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88295.log 2026-03-25T15:45:44.999 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88274.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.88274.log.gz 2026-03-25T15:45:44.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88317.log 2026-03-25T15:45:44.999 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88295.log: 27.5% -- replaced with /var/log/ceph/ceph-client.admin.88295.log.gz 2026-03-25T15:45:44.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88338.log 2026-03-25T15:45:45.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88317.log.gz 2026-03-25T15:45:45.000 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88360.log 2026-03-25T15:45:45.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88338.log.gz 2026-03-25T15:45:45.001 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88381.log 2026-03-25T15:45:45.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88360.log.gz 2026-03-25T15:45:45.001 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88403.log 2026-03-25T15:45:45.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88381.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.88381.log.gz 2026-03-25T15:45:45.002 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88424.log 2026-03-25T15:45:45.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88403.log.gz 2026-03-25T15:45:45.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.88446.log 2026-03-25T15:45:45.003 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88424.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.88424.log.gz 2026-03-25T15:45:45.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88467.log 2026-03-25T15:45:45.003 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88446.log.gz 2026-03-25T15:45:45.004 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88489.log 2026-03-25T15:45:45.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88467.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.88467.log.gz 2026-03-25T15:45:45.004 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88510.log 2026-03-25T15:45:45.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88489.log.gz 2026-03-25T15:45:45.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88532.log 2026-03-25T15:45:45.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88510.log.gz 2026-03-25T15:45:45.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88553.log 2026-03-25T15:45:45.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88532.log.gz 2026-03-25T15:45:45.006 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88575.log 2026-03-25T15:45:45.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88553.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.88553.log.gz 2026-03-25T15:45:45.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88596.log 2026-03-25T15:45:45.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88575.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88575.log.gz 2026-03-25T15:45:45.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88618.log 2026-03-25T15:45:45.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88596.log.gz 2026-03-25T15:45:45.008 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88639.log 2026-03-25T15:45:45.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88618.log.gz 2026-03-25T15:45:45.008 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88661.log 2026-03-25T15:45:45.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88639.log.gz 2026-03-25T15:45:45.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88682.log 2026-03-25T15:45:45.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88661.log.gz 2026-03-25T15:45:45.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88704.log 2026-03-25T15:45:45.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88682.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88682.log.gz 2026-03-25T15:45:45.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.88725.log 2026-03-25T15:45:45.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88704.log.gz 2026-03-25T15:45:45.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88747.log 2026-03-25T15:45:45.011 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88725.log.gz 2026-03-25T15:45:45.011 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88768.log 2026-03-25T15:45:45.011 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88747.log.gz 2026-03-25T15:45:45.012 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88790.log 2026-03-25T15:45:45.012 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88768.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88768.log.gz 2026-03-25T15:45:45.012 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88811.log 2026-03-25T15:45:45.012 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88790.log.gz 2026-03-25T15:45:45.013 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88833.log 2026-03-25T15:45:45.013 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88811.log.gz 2026-03-25T15:45:45.013 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88854.log 2026-03-25T15:45:45.013 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88833.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.88833.log.gz 2026-03-25T15:45:45.014 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88877.log 2026-03-25T15:45:45.014 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88854.log.gz 2026-03-25T15:45:45.014 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88898.log 2026-03-25T15:45:45.015 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88877.log.gz 2026-03-25T15:45:45.015 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88920.log 2026-03-25T15:45:45.015 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88898.log.gz 2026-03-25T15:45:45.015 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88941.log 2026-03-25T15:45:45.016 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88920.log.gz 2026-03-25T15:45:45.016 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88963.log 2026-03-25T15:45:45.016 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88941.log.gz 2026-03-25T15:45:45.017 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88984.log 2026-03-25T15:45:45.017 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88963.log.gz 2026-03-25T15:45:45.017 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.89006.log
2026-03-25T15:45:45.018 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.88984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88984.log.gz
2026-03-25T15:45:45.018 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89027.log
2026-03-25T15:45:45.018 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89006.log.gz
2026-03-25T15:45:45.018 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89049.log
2026-03-25T15:45:45.019 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89027.log.gz
2026-03-25T15:45:45.019 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89070.log
2026-03-25T15:45:45.019 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89049.log.gz
2026-03-25T15:45:45.019 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89092.log
2026-03-25T15:45:45.020 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89070.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89070.log.gz
2026-03-25T15:45:45.020 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89113.log
2026-03-25T15:45:45.020 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89092.log.gz
2026-03-25T15:45:45.021 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89135.log
2026-03-25T15:45:45.021 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89113.log.gz
2026-03-25T15:45:45.021 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89156.log
2026-03-25T15:45:45.022 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89135.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89135.log.gz
2026-03-25T15:45:45.022 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89178.log
2026-03-25T15:45:45.022 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89156.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89156.log.gz
2026-03-25T15:45:45.023 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89199.log
2026-03-25T15:45:45.023 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89178.log.gz
2026-03-25T15:45:45.023 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89221.log
2026-03-25T15:45:45.023 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89199.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89199.log.gz
2026-03-25T15:45:45.024 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89242.log
2026-03-25T15:45:45.024 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89221.log.gz
2026-03-25T15:45:45.024 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89264.log
2026-03-25T15:45:45.025 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89242.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89242.log.gz
2026-03-25T15:45:45.025 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89285.log
2026-03-25T15:45:45.025 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89264.log.gz
2026-03-25T15:45:45.025 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89307.log
2026-03-25T15:45:45.026 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89285.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89285.log.gz
2026-03-25T15:45:45.026 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89328.log
2026-03-25T15:45:45.026 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89307.log.gz
2026-03-25T15:45:45.026 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89350.log
2026-03-25T15:45:45.027 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89328.log.gz
2026-03-25T15:45:45.027 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89371.log
2026-03-25T15:45:45.027 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89350.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89350.log.gz
2026-03-25T15:45:45.027 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89393.log
2026-03-25T15:45:45.028 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89371.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89371.log.gz
2026-03-25T15:45:45.028 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89414.log
2026-03-25T15:45:45.028 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89393.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89393.log.gz
2026-03-25T15:45:45.029 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89436.log
2026-03-25T15:45:45.029 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89414.log.gz
2026-03-25T15:45:45.029 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89457.log
2026-03-25T15:45:45.030 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89436.log.gz
2026-03-25T15:45:45.030 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89479.log
2026-03-25T15:45:45.030 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89457.log.gz
2026-03-25T15:45:45.030 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89500.log
2026-03-25T15:45:45.031 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89479.log.gz
2026-03-25T15:45:45.031 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89522.log
2026-03-25T15:45:45.031 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89500.log.gz
2026-03-25T15:45:45.031 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89543.log
2026-03-25T15:45:45.032 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89522.log.gz
2026-03-25T15:45:45.032 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89565.log
2026-03-25T15:45:45.032 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89543.log.gz
2026-03-25T15:45:45.033 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89586.log
2026-03-25T15:45:45.033 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89565.log.gz
2026-03-25T15:45:45.033 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89608.log
2026-03-25T15:45:45.033 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89586.log.gz
2026-03-25T15:45:45.034 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89629.log
2026-03-25T15:45:45.034 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89608.log.gz
2026-03-25T15:45:45.034 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89650.log
2026-03-25T15:45:45.035 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89629.log.gz
2026-03-25T15:45:45.035 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89667.log
2026-03-25T15:45:45.035 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89650.log.gz
2026-03-25T15:45:45.035 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89689.log
2026-03-25T15:45:45.036 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89667.log.gz
2026-03-25T15:45:45.036 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89709.log
2026-03-25T15:45:45.036 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89689.log.gz
2026-03-25T15:45:45.036 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89731.log
2026-03-25T15:45:45.037 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89709.log.gz
2026-03-25T15:45:45.037 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89753.log
2026-03-25T15:45:45.037 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89731.log.gz
2026-03-25T15:45:45.038 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89773.log
2026-03-25T15:45:45.038 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89753.log.gz
2026-03-25T15:45:45.038 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89795.log
2026-03-25T15:45:45.038 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89773.log.gz
2026-03-25T15:45:45.039 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89815.log
2026-03-25T15:45:45.039 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89795.log.gz
2026-03-25T15:45:45.039 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89836.log
2026-03-25T15:45:45.040 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89815.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89815.log.gz
2026-03-25T15:45:45.040 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89856.log
2026-03-25T15:45:45.040 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89836.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89836.log.gz
2026-03-25T15:45:45.041 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89877.log
2026-03-25T15:45:45.041 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89856.log.gz
2026-03-25T15:45:45.041 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89897.log
2026-03-25T15:45:45.041 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89877.log.gz
2026-03-25T15:45:45.042 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89918.log
2026-03-25T15:45:45.042 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89897.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89897.log.gz
2026-03-25T15:45:45.042 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89941.log
2026-03-25T15:45:45.043 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89918.log.gz
2026-03-25T15:45:45.043 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89961.log
2026-03-25T15:45:45.043 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89941.log.gz
2026-03-25T15:45:45.043 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89982.log
2026-03-25T15:45:45.044 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89961.log.gz
2026-03-25T15:45:45.044 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90002.log
2026-03-25T15:45:45.044 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.89982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89982.log.gz
2026-03-25T15:45:45.044 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90025.log
2026-03-25T15:45:45.045 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90002.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90002.log.gz
2026-03-25T15:45:45.045 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90045.log
2026-03-25T15:45:45.045 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90025.log.gz
2026-03-25T15:45:45.046 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90065.log
2026-03-25T15:45:45.046 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90045.log.gz
2026-03-25T15:45:45.046 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90088.log
2026-03-25T15:45:45.046 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90065.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90065.log.gz
2026-03-25T15:45:45.047 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90108.log
2026-03-25T15:45:45.047 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90088.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90088.log.gz
2026-03-25T15:45:45.047 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90128.log
2026-03-25T15:45:45.048 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90108.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90108.log.gz
2026-03-25T15:45:45.048 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90150.log
2026-03-25T15:45:45.048 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90128.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.90128.log.gz
2026-03-25T15:45:45.048 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90171.log
2026-03-25T15:45:45.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90150.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90150.log.gz
2026-03-25T15:45:45.049 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90193.log
2026-03-25T15:45:45.049 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90171.log.gz
2026-03-25T15:45:45.050 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90214.log
2026-03-25T15:45:45.050 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90193.log.gz
2026-03-25T15:45:45.050 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90236.log
2026-03-25T15:45:45.050 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90214.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90214.log.gz
2026-03-25T15:45:45.051 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90257.log
2026-03-25T15:45:45.051 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90236.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90236.log.gz
2026-03-25T15:45:45.051 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90279.log
2026-03-25T15:45:45.052 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90257.log.gz
2026-03-25T15:45:45.052 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90300.log
2026-03-25T15:45:45.052 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90279.log.gz
2026-03-25T15:45:45.052 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90322.log
2026-03-25T15:45:45.053 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90300.log.gz
2026-03-25T15:45:45.053 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90343.log
2026-03-25T15:45:45.053 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90322.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90322.log.gz
2026-03-25T15:45:45.053 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90365.log
2026-03-25T15:45:45.054 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90343.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90343.log.gz
2026-03-25T15:45:45.054 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90386.log
2026-03-25T15:45:45.054 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90365.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90365.log.gz
2026-03-25T15:45:45.055 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90408.log
2026-03-25T15:45:45.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90386.log.gz
2026-03-25T15:45:45.055 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90429.log
2026-03-25T15:45:45.055 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90408.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90408.log.gz
2026-03-25T15:45:45.056 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90451.log
2026-03-25T15:45:45.056 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90429.log.gz
2026-03-25T15:45:45.056 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90472.log
2026-03-25T15:45:45.057 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90451.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90451.log.gz
2026-03-25T15:45:45.057 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90494.log
2026-03-25T15:45:45.057 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90472.log.gz
2026-03-25T15:45:45.057 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90515.log
2026-03-25T15:45:45.058 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90494.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90494.log.gz
2026-03-25T15:45:45.058 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90537.log
2026-03-25T15:45:45.058 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90515.log.gz
2026-03-25T15:45:45.059 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90558.log
2026-03-25T15:45:45.059 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90537.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90537.log.gz
2026-03-25T15:45:45.059 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90580.log
2026-03-25T15:45:45.060 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90558.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90558.log.gz
2026-03-25T15:45:45.060 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90601.log
2026-03-25T15:45:45.060 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90580.log.gz
2026-03-25T15:45:45.060 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90623.log
2026-03-25T15:45:45.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90601.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90601.log.gz
2026-03-25T15:45:45.061 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90644.log
2026-03-25T15:45:45.061 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90623.log.gz
2026-03-25T15:45:45.062 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90666.log
2026-03-25T15:45:45.062 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90644.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90644.log.gz
2026-03-25T15:45:45.062 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90687.log
2026-03-25T15:45:45.062 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90666.log.gz
2026-03-25T15:45:45.063 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90709.log
2026-03-25T15:45:45.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90687.log.gz
2026-03-25T15:45:45.063 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90730.log
2026-03-25T15:45:45.063 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90709.log.gz
2026-03-25T15:45:45.064 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90752.log
2026-03-25T15:45:45.064 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90730.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90730.log.gz
2026-03-25T15:45:45.064 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90773.log
2026-03-25T15:45:45.065 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90752.log.gz
2026-03-25T15:45:45.065 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90795.log
2026-03-25T15:45:45.065 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90773.log.gz
2026-03-25T15:45:45.065 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90816.log
2026-03-25T15:45:45.066 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90795.log.gz
2026-03-25T15:45:45.066 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90838.log
2026-03-25T15:45:45.066 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90816.log.gz
2026-03-25T15:45:45.066 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90859.log
2026-03-25T15:45:45.067 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90838.log.gz
2026-03-25T15:45:45.067 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90881.log
2026-03-25T15:45:45.067 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90859.log.gz
2026-03-25T15:45:45.068 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90902.log
2026-03-25T15:45:45.068 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90881.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90881.log.gz
2026-03-25T15:45:45.068 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90923.log
2026-03-25T15:45:45.068 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90902.log.gz
2026-03-25T15:45:45.069 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90940.log
2026-03-25T15:45:45.069 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90923.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90923.log.gz
2026-03-25T15:45:45.069 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90960.log
2026-03-25T15:45:45.070 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90940.log.gz
2026-03-25T15:45:45.070 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90980.log
2026-03-25T15:45:45.070 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90960.log.gz
2026-03-25T15:45:45.070 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91006.log
2026-03-25T15:45:45.071 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.90980.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90980.log.gz
2026-03-25T15:45:45.071 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91023.log
2026-03-25T15:45:45.071 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91006.log.gz
2026-03-25T15:45:45.071 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91043.log
2026-03-25T15:45:45.072 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91023.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91023.log.gz
2026-03-25T15:45:45.072 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91061.log
2026-03-25T15:45:45.072 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91043.log.gz
2026-03-25T15:45:45.073 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91083.log
2026-03-25T15:45:45.073 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91061.log.gz
2026-03-25T15:45:45.073 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91100.log
2026-03-25T15:45:45.073 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91083.log.gz
2026-03-25T15:45:45.074 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91120.log
2026-03-25T15:45:45.074 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91100.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91100.log.gz
2026-03-25T15:45:45.074 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91140.log
2026-03-25T15:45:45.074 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91120.log.gz
2026-03-25T15:45:45.075 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91160.log
2026-03-25T15:45:45.075 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91140.log.gz
2026-03-25T15:45:45.075 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91180.log
2026-03-25T15:45:45.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91160.log.gz
2026-03-25T15:45:45.076 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91198.log
2026-03-25T15:45:45.076 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91180.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91180.log.gz
2026-03-25T15:45:45.076 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91220.log
2026-03-25T15:45:45.077 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91198.log.gz
2026-03-25T15:45:45.077 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91237.log
2026-03-25T15:45:45.077 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91220.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91220.log.gz
2026-03-25T15:45:45.078 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91240.log
2026-03-25T15:45:45.078 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91237.log.gz
2026-03-25T15:45:45.078 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91261.log
2026-03-25T15:45:45.079 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91240.log: 50.8% -- replaced with /var/log/ceph/ceph-client.admin.91240.log.gz
2026-03-25T15:45:45.079 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91282.log
2026-03-25T15:45:45.079 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91261.log: 43.5% -- replaced with /var/log/ceph/ceph-client.admin.91261.log.gz
2026-03-25T15:45:45.080 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91286.log
2026-03-25T15:45:45.080 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91282.log.gz
2026-03-25T15:45:45.080 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91307.log
2026-03-25T15:45:45.080 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91286.log.gz
2026-03-25T15:45:45.081 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91327.log
2026-03-25T15:45:45.081 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91307.log.gz
2026-03-25T15:45:45.081 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91345.log
2026-03-25T15:45:45.082 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91327.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91327.log.gz
2026-03-25T15:45:45.082 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91367.log
2026-03-25T15:45:45.082 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91345.log.gz
2026-03-25T15:45:45.082 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91385.log
2026-03-25T15:45:45.083 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91367.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91367.log.gz
2026-03-25T15:45:45.083 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91406.log
2026-03-25T15:45:45.083 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91385.log.gz
2026-03-25T15:45:45.083 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91427.log
2026-03-25T15:45:45.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91406.log.gz
2026-03-25T15:45:45.084 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91448.log
2026-03-25T15:45:45.084 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91427.log.gz
2026-03-25T15:45:45.085 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91469.log
2026-03-25T15:45:45.085 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91448.log.gz
2026-03-25T15:45:45.085 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91489.log
2026-03-25T15:45:45.085 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91469.log.gz
2026-03-25T15:45:45.086 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91509.log
2026-03-25T15:45:45.086 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91489.log.gz
2026-03-25T15:45:45.086 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91530.log
2026-03-25T15:45:45.087 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91509.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.91509.log.gz
2026-03-25T15:45:45.087 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91551.log
2026-03-25T15:45:45.087 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91530.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.91530.log.gz
2026-03-25T15:45:45.087 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91571.log
2026-03-25T15:45:45.088 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91551.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91551.log.gz
2026-03-25T15:45:45.088 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91591.log
2026-03-25T15:45:45.088 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91571.log.gz
2026-03-25T15:45:45.089 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91612.log
2026-03-25T15:45:45.089 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91591.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.91591.log.gz
2026-03-25T15:45:45.089 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91637.log
2026-03-25T15:45:45.089 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91612.log.gz
2026-03-25T15:45:45.090 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91658.log
2026-03-25T15:45:45.090 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91637.log.gz
2026-03-25T15:45:45.090 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91680.log
2026-03-25T15:45:45.090 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91658.log.gz
2026-03-25T15:45:45.091 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91701.log
2026-03-25T15:45:45.091 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91680.log.gz
2026-03-25T15:45:45.091 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91723.log
2026-03-25T15:45:45.092 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91701.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91701.log.gz
2026-03-25T15:45:45.092 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91744.log
2026-03-25T15:45:45.092 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91723.log.gz
2026-03-25T15:45:45.092 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91766.log
2026-03-25T15:45:45.093 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91744.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91744.log.gz
2026-03-25T15:45:45.093 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91787.log
2026-03-25T15:45:45.093 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91766.log.gz
2026-03-25T15:45:45.093 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91809.log
2026-03-25T15:45:45.094 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91787.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91787.log.gz
2026-03-25T15:45:45.094 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91830.log
2026-03-25T15:45:45.094 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91809.log: 0.0% -- replaced
with /var/log/ceph/ceph-client.admin.91809.log.gz 2026-03-25T15:45:45.095 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91852.log 2026-03-25T15:45:45.095 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91830.log.gz 2026-03-25T15:45:45.095 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91873.log 2026-03-25T15:45:45.096 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91852.log.gz 2026-03-25T15:45:45.096 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91895.log 2026-03-25T15:45:45.096 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91873.log.gz 2026-03-25T15:45:45.096 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91916.log 2026-03-25T15:45:45.097 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91895.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91895.log.gz 2026-03-25T15:45:45.097 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91938.log 2026-03-25T15:45:45.097 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91916.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91916.log.gz 2026-03-25T15:45:45.097 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91959.log 2026-03-25T15:45:45.098 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91938.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91938.log.gz 2026-03-25T15:45:45.098 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.91981.log 2026-03-25T15:45:45.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91959.log.gz 2026-03-25T15:45:45.099 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92002.log 2026-03-25T15:45:45.099 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.91981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91981.log.gz 2026-03-25T15:45:45.099 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92024.log 2026-03-25T15:45:45.100 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92002.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92002.log.gz 2026-03-25T15:45:45.100 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92045.log 2026-03-25T15:45:45.100 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92024.log.gz 2026-03-25T15:45:45.101 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92067.log 2026-03-25T15:45:45.101 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92045.log.gz 2026-03-25T15:45:45.101 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92088.log 2026-03-25T15:45:45.101 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92067.log.gz 2026-03-25T15:45:45.102 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92110.log 2026-03-25T15:45:45.102 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92088.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.92088.log.gz 2026-03-25T15:45:45.102 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92131.log 2026-03-25T15:45:45.103 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92110.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92110.log.gz 2026-03-25T15:45:45.103 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92153.log 2026-03-25T15:45:45.103 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92131.log.gz 2026-03-25T15:45:45.103 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92174.log 2026-03-25T15:45:45.104 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92153.log.gz 2026-03-25T15:45:45.104 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92196.log 2026-03-25T15:45:45.104 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92174.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92174.log.gz 2026-03-25T15:45:45.104 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92217.log 2026-03-25T15:45:45.105 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92196.log.gz 2026-03-25T15:45:45.105 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92239.log 2026-03-25T15:45:45.105 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92217.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92217.log.gz 2026-03-25T15:45:45.106 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.92260.log 2026-03-25T15:45:45.106 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92239.log.gz 2026-03-25T15:45:45.106 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92282.log 2026-03-25T15:45:45.106 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92260.log.gz 2026-03-25T15:45:45.107 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92303.log 2026-03-25T15:45:45.107 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92282.log.gz 2026-03-25T15:45:45.107 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92325.log 2026-03-25T15:45:45.108 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92303.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92303.log.gz 2026-03-25T15:45:45.108 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92346.log 2026-03-25T15:45:45.108 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92325.log.gz 2026-03-25T15:45:45.108 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92368.log 2026-03-25T15:45:45.109 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92346.log.gz 2026-03-25T15:45:45.109 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92389.log 2026-03-25T15:45:45.109 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92368.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.92368.log.gz 2026-03-25T15:45:45.109 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92410.log 2026-03-25T15:45:45.110 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92389.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92389.log.gz 2026-03-25T15:45:45.110 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92427.log 2026-03-25T15:45:45.110 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92410.log.gz 2026-03-25T15:45:45.111 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92444.log 2026-03-25T15:45:45.111 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92427.log.gz 2026-03-25T15:45:45.111 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92462.log 2026-03-25T15:45:45.111 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92444.log.gz 2026-03-25T15:45:45.112 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92480.log 2026-03-25T15:45:45.112 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92462.log.gz 2026-03-25T15:45:45.112 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92499.log 2026-03-25T15:45:45.113 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92480.log.gz 2026-03-25T15:45:45.113 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.92520.log 2026-03-25T15:45:45.113 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92499.log.gz 2026-03-25T15:45:45.113 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92541.log 2026-03-25T15:45:45.114 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92520.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92520.log.gz 2026-03-25T15:45:45.114 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92562.log 2026-03-25T15:45:45.114 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92541.log.gz 2026-03-25T15:45:45.114 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92580.log 2026-03-25T15:45:45.115 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92562.log.gz 2026-03-25T15:45:45.115 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92599.log 2026-03-25T15:45:45.115 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92580.log.gz 2026-03-25T15:45:45.116 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92620.log 2026-03-25T15:45:45.116 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92599.log.gz 2026-03-25T15:45:45.116 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92638.log 2026-03-25T15:45:45.116 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92620.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.92620.log.gz 2026-03-25T15:45:45.117 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92657.log 2026-03-25T15:45:45.117 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92638.log.gz 2026-03-25T15:45:45.117 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92678.log 2026-03-25T15:45:45.118 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92657.log.gz 2026-03-25T15:45:45.118 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92700.log 2026-03-25T15:45:45.118 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92678.log.gz 2026-03-25T15:45:45.118 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92718.log 2026-03-25T15:45:45.119 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92700.log.gz 2026-03-25T15:45:45.119 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92739.log 2026-03-25T15:45:45.119 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92718.log.gz 2026-03-25T15:45:45.120 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92761.log 2026-03-25T15:45:45.120 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92739.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92739.log.gz 2026-03-25T15:45:45.120 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.92779.log 2026-03-25T15:45:45.120 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92761.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92761.log.gz 2026-03-25T15:45:45.121 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92800.log 2026-03-25T15:45:45.121 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92779.log.gz 2026-03-25T15:45:45.121 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92822.log 2026-03-25T15:45:45.121 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92800.log.gz 2026-03-25T15:45:45.122 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92839.log 2026-03-25T15:45:45.122 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92822.log.gz 2026-03-25T15:45:45.122 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92857.log 2026-03-25T15:45:45.123 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92839.log.gz 2026-03-25T15:45:45.123 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92876.log 2026-03-25T15:45:45.123 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92857.log.gz 2026-03-25T15:45:45.123 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92897.log 2026-03-25T15:45:45.124 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92876.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.92876.log.gz 2026-03-25T15:45:45.124 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92918.log 2026-03-25T15:45:45.124 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92897.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92897.log.gz 2026-03-25T15:45:45.124 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92936.log 2026-03-25T15:45:45.125 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92918.log.gz 2026-03-25T15:45:45.125 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92955.log 2026-03-25T15:45:45.125 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92936.log.gz 2026-03-25T15:45:45.126 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92976.log 2026-03-25T15:45:45.126 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92955.log.gz 2026-03-25T15:45:45.126 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.92994.log 2026-03-25T15:45:45.126 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92976.log: 10.7% -- replaced with /var/log/ceph/ceph-client.admin.92976.log.gz 2026-03-25T15:45:45.127 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93015.log 2026-03-25T15:45:45.127 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.92994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.92994.log.gz 2026-03-25T15:45:45.127 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.93032.log 2026-03-25T15:45:45.128 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93015.log.gz 2026-03-25T15:45:45.128 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93052.log 2026-03-25T15:45:45.128 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93032.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93032.log.gz 2026-03-25T15:45:45.128 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93072.log 2026-03-25T15:45:45.129 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93052.log.gz 2026-03-25T15:45:45.129 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93092.log 2026-03-25T15:45:45.129 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93072.log.gz 2026-03-25T15:45:45.129 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93113.log 2026-03-25T15:45:45.130 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93092.log.gz 2026-03-25T15:45:45.130 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93131.log 2026-03-25T15:45:45.130 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93113.log.gz 2026-03-25T15:45:45.130 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93150.log 2026-03-25T15:45:45.131 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93131.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.93131.log.gz 2026-03-25T15:45:45.131 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93171.log 2026-03-25T15:45:45.131 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93150.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93150.log.gz 2026-03-25T15:45:45.132 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93193.log 2026-03-25T15:45:45.132 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93171.log.gz 2026-03-25T15:45:45.132 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93211.log 2026-03-25T15:45:45.133 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93193.log.gz 2026-03-25T15:45:45.133 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93234.log 2026-03-25T15:45:45.133 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93211.log.gz 2026-03-25T15:45:45.134 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93255.log 2026-03-25T15:45:45.134 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93234.log.gz 2026-03-25T15:45:45.135 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93277.log 2026-03-25T15:45:45.135 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93255.log.gz 2026-03-25T15:45:45.135 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.93298.log 2026-03-25T15:45:45.136 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93277.log.gz 2026-03-25T15:45:45.136 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93319.log 2026-03-25T15:45:45.136 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93298.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.93298.log.gz 2026-03-25T15:45:45.137 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93339.log 2026-03-25T15:45:45.137 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93319.log.gz 2026-03-25T15:45:45.137 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93359.log 2026-03-25T15:45:45.138 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93339.log.gz 2026-03-25T15:45:45.138 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93382.log 2026-03-25T15:45:45.139 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93359.log.gz 2026-03-25T15:45:45.139 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93399.log 2026-03-25T15:45:45.139 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93382.log.gz 2026-03-25T15:45:45.140 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93419.log 2026-03-25T15:45:45.140 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93399.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.93399.log.gz 2026-03-25T15:45:45.140 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93439.log 2026-03-25T15:45:45.141 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93419.log.gz 2026-03-25T15:45:45.141 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93462.log 2026-03-25T15:45:45.141 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93439.log.gz 2026-03-25T15:45:45.142 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93482.log 2026-03-25T15:45:45.142 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93462.log.gz 2026-03-25T15:45:45.142 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93505.log 2026-03-25T15:45:45.143 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93482.log: 30.3% -- replaced with /var/log/ceph/ceph-client.admin.93482.log.gz 2026-03-25T15:45:45.143 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93525.log 2026-03-25T15:45:45.143 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93505.log.gz 2026-03-25T15:45:45.144 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93543.log 2026-03-25T15:45:45.144 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93525.log.gz 2026-03-25T15:45:45.144 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.93562.log
2026-03-25T15:45:45.144 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93543.log.gz
2026-03-25T15:45:45.145 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93583.log
2026-03-25T15:45:45.145 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93562.log.gz
2026-03-25T15:45:45.145 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93601.log
2026-03-25T15:45:45.146 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93583.log: 10.7% -- replaced with /var/log/ceph/ceph-client.admin.93583.log.gz
2026-03-25T15:45:45.146 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93623.log
2026-03-25T15:45:45.146 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93601.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93601.log.gz
2026-03-25T15:45:45.147 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93644.log
2026-03-25T15:45:45.147 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93623.log.gz
2026-03-25T15:45:45.147 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93666.log
2026-03-25T15:45:45.148 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93644.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93644.log.gz
2026-03-25T15:45:45.148 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93687.log
2026-03-25T15:45:45.148 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93666.log.gz
2026-03-25T15:45:45.148 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93709.log
2026-03-25T15:45:45.149 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93687.log.gz
2026-03-25T15:45:45.149 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93730.log
2026-03-25T15:45:45.149 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93709.log.gz
2026-03-25T15:45:45.150 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93752.log
2026-03-25T15:45:45.150 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93730.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93730.log.gz
2026-03-25T15:45:45.150 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93773.log
2026-03-25T15:45:45.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93752.log.gz
2026-03-25T15:45:45.151 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93795.log
2026-03-25T15:45:45.151 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93773.log.gz
2026-03-25T15:45:45.151 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93816.log
2026-03-25T15:45:45.152 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93795.log.gz
2026-03-25T15:45:45.152 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93838.log
2026-03-25T15:45:45.152 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93816.log.gz
2026-03-25T15:45:45.152 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93859.log
2026-03-25T15:45:45.153 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93838.log.gz
2026-03-25T15:45:45.153 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93881.log
2026-03-25T15:45:45.153 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93859.log.gz
2026-03-25T15:45:45.154 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93902.log
2026-03-25T15:45:45.154 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93881.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93881.log.gz
2026-03-25T15:45:45.154 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93924.log
2026-03-25T15:45:45.155 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93902.log.gz
2026-03-25T15:45:45.155 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93945.log
2026-03-25T15:45:45.155 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93924.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93924.log.gz
2026-03-25T15:45:45.156 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93967.log
2026-03-25T15:45:45.156 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93945.log.gz
2026-03-25T15:45:45.156 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.93988.log
2026-03-25T15:45:45.157 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93967.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93967.log.gz
2026-03-25T15:45:45.157 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94010.log
2026-03-25T15:45:45.157 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.93988.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.93988.log.gz
2026-03-25T15:45:45.157 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94031.log
2026-03-25T15:45:45.158 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94010.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94010.log.gz
2026-03-25T15:45:45.158 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94053.log
2026-03-25T15:45:45.158 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94031.log.gz
2026-03-25T15:45:45.159 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94074.log
2026-03-25T15:45:45.159 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94053.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94053.log.gz
2026-03-25T15:45:45.159 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94096.log
2026-03-25T15:45:45.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94074.log.gz
2026-03-25T15:45:45.160 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94117.log
2026-03-25T15:45:45.160 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94096.log.gz
2026-03-25T15:45:45.160 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94139.log
2026-03-25T15:45:45.161 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94117.log.gz
2026-03-25T15:45:45.161 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94160.log
2026-03-25T15:45:45.161 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94139.log.gz
2026-03-25T15:45:45.161 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94182.log
2026-03-25T15:45:45.162 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94160.log.gz
2026-03-25T15:45:45.162 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94203.log
2026-03-25T15:45:45.162 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94182.log.gz
2026-03-25T15:45:45.163 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94225.log
2026-03-25T15:45:45.163 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94203.log.gz
2026-03-25T15:45:45.163 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94246.log
2026-03-25T15:45:45.164 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94225.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94225.log.gz
2026-03-25T15:45:45.164 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94268.log
2026-03-25T15:45:45.164 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94246.log.gz
2026-03-25T15:45:45.164 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94289.log
2026-03-25T15:45:45.165 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94268.log.gz
2026-03-25T15:45:45.165 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94311.log
2026-03-25T15:45:45.165 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94289.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94289.log.gz
2026-03-25T15:45:45.165 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94332.log
2026-03-25T15:45:45.166 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94311.log.gz
2026-03-25T15:45:45.166 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94354.log
2026-03-25T15:45:45.166 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94332.log.gz
2026-03-25T15:45:45.167 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94375.log
2026-03-25T15:45:45.167 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94354.log.gz
2026-03-25T15:45:45.167 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94397.log
2026-03-25T15:45:45.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94375.log.gz
2026-03-25T15:45:45.168 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94418.log
2026-03-25T15:45:45.168 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94397.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94397.log.gz
2026-03-25T15:45:45.168 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94440.log
2026-03-25T15:45:45.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94418.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94418.log.gz
2026-03-25T15:45:45.169 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94461.log
2026-03-25T15:45:45.169 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94440.log.gz
2026-03-25T15:45:45.169 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94483.log
2026-03-25T15:45:45.170 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94461.log.gz
2026-03-25T15:45:45.170 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94504.log
2026-03-25T15:45:45.170 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94483.log.gz
2026-03-25T15:45:45.171 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94526.log
2026-03-25T15:45:45.171 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94504.log.gz
2026-03-25T15:45:45.171 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94547.log
2026-03-25T15:45:45.171 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94526.log.gz
2026-03-25T15:45:45.172 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94569.log
2026-03-25T15:45:45.172 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94547.log.gz
2026-03-25T15:45:45.172 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94590.log
2026-03-25T15:45:45.173 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94569.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94569.log.gz
2026-03-25T15:45:45.173 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94612.log
2026-03-25T15:45:45.173 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94590.log.gz
2026-03-25T15:45:45.173 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94633.log
2026-03-25T15:45:45.174 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94612.log.gz
2026-03-25T15:45:45.174 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94655.log
2026-03-25T15:45:45.174 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94633.log.gz
2026-03-25T15:45:45.174 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94676.log
2026-03-25T15:45:45.175 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94655.log.gz
2026-03-25T15:45:45.175 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94698.log
2026-03-25T15:45:45.175 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94676.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94676.log.gz
2026-03-25T15:45:45.176 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94719.log
2026-03-25T15:45:45.176 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94698.log.gz
2026-03-25T15:45:45.176 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94741.log
2026-03-25T15:45:45.177 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94719.log.gz
2026-03-25T15:45:45.177 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94762.log
2026-03-25T15:45:45.177 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94741.log.gz
2026-03-25T15:45:45.177 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94784.log
2026-03-25T15:45:45.178 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94762.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94762.log.gz
2026-03-25T15:45:45.178 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94805.log
2026-03-25T15:45:45.178 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94784.log.gz
2026-03-25T15:45:45.179 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94827.log
2026-03-25T15:45:45.179 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94805.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94805.log.gz
2026-03-25T15:45:45.179 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94848.log
2026-03-25T15:45:45.180 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94827.log.gz
2026-03-25T15:45:45.180 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94870.log
2026-03-25T15:45:45.180 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94848.log.gz
2026-03-25T15:45:45.180 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94891.log
2026-03-25T15:45:45.181 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94870.log.gz
2026-03-25T15:45:45.181 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94913.log
2026-03-25T15:45:45.181 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94891.log.gz
2026-03-25T15:45:45.181 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94934.log
2026-03-25T15:45:45.182 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94913.log.gz
2026-03-25T15:45:45.182 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94956.log
2026-03-25T15:45:45.182 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94934.log.gz
2026-03-25T15:45:45.182 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94977.log
2026-03-25T15:45:45.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94956.log.gz
2026-03-25T15:45:45.183 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.94999.log
2026-03-25T15:45:45.183 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94977.log.gz
2026-03-25T15:45:45.184 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95020.log
2026-03-25T15:45:45.184 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.94999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.94999.log.gz
2026-03-25T15:45:45.184 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95042.log
2026-03-25T15:45:45.184 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95020.log.gz
2026-03-25T15:45:45.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95063.log
2026-03-25T15:45:45.185 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95042.log.gz
2026-03-25T15:45:45.185 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95085.log
2026-03-25T15:45:45.186 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95063.log.gz
2026-03-25T15:45:45.186 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95106.log
2026-03-25T15:45:45.186 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95085.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95085.log.gz
2026-03-25T15:45:45.186 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95128.log
2026-03-25T15:45:45.187 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95106.log.gz
2026-03-25T15:45:45.187 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95149.log
2026-03-25T15:45:45.187 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95128.log.gz
2026-03-25T15:45:45.188 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95170.log
2026-03-25T15:45:45.188 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95149.log.gz
2026-03-25T15:45:45.188 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95189.log
2026-03-25T15:45:45.188 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95170.log.gz
2026-03-25T15:45:45.189 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95206.log
2026-03-25T15:45:45.189 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95189.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95189.log.gz
2026-03-25T15:45:45.189 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95223.log
2026-03-25T15:45:45.190 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95206.log.gz
2026-03-25T15:45:45.190 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95240.log
2026-03-25T15:45:45.190 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95223.log.gz
2026-03-25T15:45:45.190 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95261.log
2026-03-25T15:45:45.191 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95240.log.gz
2026-03-25T15:45:45.191 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95282.log
2026-03-25T15:45:45.191 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95261.log.gz
2026-03-25T15:45:45.191 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95301.log
2026-03-25T15:45:45.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95282.log.gz
2026-03-25T15:45:45.192 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95322.log
2026-03-25T15:45:45.192 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95301.log.gz
2026-03-25T15:45:45.193 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95341.log
2026-03-25T15:45:45.193 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95322.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95322.log.gz
2026-03-25T15:45:45.193 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95358.log
2026-03-25T15:45:45.193 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95341.log.gz
2026-03-25T15:45:45.194 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95375.log
2026-03-25T15:45:45.194 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95358.log.gz
2026-03-25T15:45:45.194 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95396.log
2026-03-25T15:45:45.195 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95375.log.gz
2026-03-25T15:45:45.195 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95417.log
2026-03-25T15:45:45.195 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95396.log.gz
2026-03-25T15:45:45.195 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95436.log
2026-03-25T15:45:45.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95417.log.gz
2026-03-25T15:45:45.196 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95453.log
2026-03-25T15:45:45.196 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95436.log.gz
2026-03-25T15:45:45.196 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95472.log
2026-03-25T15:45:45.197 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95453.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95453.log.gz
2026-03-25T15:45:45.197 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95494.log
2026-03-25T15:45:45.197 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95472.log.gz
2026-03-25T15:45:45.198 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95513.log
2026-03-25T15:45:45.198 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95494.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95494.log.gz
2026-03-25T15:45:45.199 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95531.log
2026-03-25T15:45:45.199 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95513.log.gz
2026-03-25T15:45:45.199 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95553.log
2026-03-25T15:45:45.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95531.log.gz
2026-03-25T15:45:45.200 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95572.log
2026-03-25T15:45:45.200 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95553.log.gz
2026-03-25T15:45:45.201 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95589.log
2026-03-25T15:45:45.201 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95572.log.gz
2026-03-25T15:45:45.201 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95609.log
2026-03-25T15:45:45.202 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95589.log.gz
2026-03-25T15:45:45.202 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95626.log
2026-03-25T15:45:45.202 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95609.log.gz
2026-03-25T15:45:45.203 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95643.log
2026-03-25T15:45:45.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95626.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95626.log.gz
2026-03-25T15:45:45.203 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95664.log
2026-03-25T15:45:45.203 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95643.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95643.log.gz
2026-03-25T15:45:45.204 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95685.log
2026-03-25T15:45:45.204 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95664.log.gz
2026-03-25T15:45:45.204 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95706.log
2026-03-25T15:45:45.205 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95685.log.gz
2026-03-25T15:45:45.205 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95725.log
2026-03-25T15:45:45.205 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95706.log.gz
2026-03-25T15:45:45.205 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95747.log
2026-03-25T15:45:45.206 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95725.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.95725.log.gz
2026-03-25T15:45:45.206 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95766.log
2026-03-25T15:45:45.207 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95747.log.gz
2026-03-25T15:45:45.207 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95785.log
2026-03-25T15:45:45.207 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95766.log.gz
2026-03-25T15:45:45.207 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95803.log
2026-03-25T15:45:45.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95785.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95785.log.gz
2026-03-25T15:45:45.208 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95823.log
2026-03-25T15:45:45.208 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95803.log.gz
2026-03-25T15:45:45.209 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95844.log
2026-03-25T15:45:45.209 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95823.log.gz
2026-03-25T15:45:45.209 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95863.log
2026-03-25T15:45:45.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95844.log.gz
2026-03-25T15:45:45.210 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95880.log
2026-03-25T15:45:45.210 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95863.log.gz
2026-03-25T15:45:45.210 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95897.log
2026-03-25T15:45:45.211 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95880.log.gz
2026-03-25T15:45:45.211 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95917.log
2026-03-25T15:45:45.211 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95897.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95897.log.gz
2026-03-25T15:45:45.212 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95934.log
2026-03-25T15:45:45.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95917.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95917.log.gz
2026-03-25T15:45:45.212 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95955.log
2026-03-25T15:45:45.212 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95934.log.gz
2026-03-25T15:45:45.213 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95976.log
2026-03-25T15:45:45.213 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95955.log.gz
2026-03-25T15:45:45.213 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.95997.log
2026-03-25T15:45:45.214 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95976.log.gz
2026-03-25T15:45:45.214 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96016.log
2026-03-25T15:45:45.214 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.95997.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.95997.log.gz
2026-03-25T15:45:45.214 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96038.log
2026-03-25T15:45:45.215 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96016.log: 59.0% -- replaced with /var/log/ceph/ceph-client.admin.96016.log.gz
2026-03-25T15:45:45.215 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96057.log
2026-03-25T15:45:45.215 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96038.log.gz
2026-03-25T15:45:45.216 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96076.log
2026-03-25T15:45:45.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96057.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96057.log.gz
2026-03-25T15:45:45.216 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96094.log
2026-03-25T15:45:45.216 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96076.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96076.log.gz
2026-03-25T15:45:45.217 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96114.log
2026-03-25T15:45:45.217 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96094.log.gz
2026-03-25T15:45:45.217 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96135.log
2026-03-25T15:45:45.218 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96114.log.gz
2026-03-25T15:45:45.218 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96154.log
2026-03-25T15:45:45.218 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96135.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96135.log.gz
2026-03-25T15:45:45.219 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96171.log
2026-03-25T15:45:45.219 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96154.log.gz
2026-03-25T15:45:45.219 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96188.log
2026-03-25T15:45:45.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96171.log.gz
2026-03-25T15:45:45.220 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96205.log
2026-03-25T15:45:45.220 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96188.log.gz
2026-03-25T15:45:45.220 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96225.log
2026-03-25T15:45:45.221 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96205.log.gz
2026-03-25T15:45:45.221 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96246.log
2026-03-25T15:45:45.221 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96225.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96225.log.gz
2026-03-25T15:45:45.222 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96267.log
2026-03-25T15:45:45.222 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96246.log.gz
2026-03-25T15:45:45.222 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96288.log
2026-03-25T15:45:45.223 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96267.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96267.log.gz
2026-03-25T15:45:45.223 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96307.log
2026-03-25T15:45:45.223 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96288.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96288.log.gz
2026-03-25T15:45:45.224 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96329.log
2026-03-25T15:45:45.224 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96307.log: 55.0% -- replaced with /var/log/ceph/ceph-client.admin.96307.log.gz
2026-03-25T15:45:45.224 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96348.log
2026-03-25T15:45:45.224 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96329.log: 0.0% -- replaced
with /var/log/ceph/ceph-client.admin.96329.log.gz 2026-03-25T15:45:45.225 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96367.log 2026-03-25T15:45:45.225 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96348.log.gz 2026-03-25T15:45:45.225 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96385.log 2026-03-25T15:45:45.226 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96367.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96367.log.gz 2026-03-25T15:45:45.226 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96405.log 2026-03-25T15:45:45.226 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96385.log.gz 2026-03-25T15:45:45.226 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96426.log 2026-03-25T15:45:45.227 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96405.log.gz 2026-03-25T15:45:45.227 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96445.log 2026-03-25T15:45:45.227 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96426.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96426.log.gz 2026-03-25T15:45:45.228 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96462.log 2026-03-25T15:45:45.228 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96445.log.gz 2026-03-25T15:45:45.228 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.96482.log 2026-03-25T15:45:45.228 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96462.log.gz 2026-03-25T15:45:45.229 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96502.log 2026-03-25T15:45:45.229 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96482.log.gz 2026-03-25T15:45:45.229 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96522.log 2026-03-25T15:45:45.230 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96502.log.gz 2026-03-25T15:45:45.230 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96539.log 2026-03-25T15:45:45.230 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96522.log.gz 2026-03-25T15:45:45.230 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96559.log 2026-03-25T15:45:45.231 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96539.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96539.log.gz 2026-03-25T15:45:45.231 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96579.log 2026-03-25T15:45:45.231 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96559.log.gz 2026-03-25T15:45:45.232 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96599.log 2026-03-25T15:45:45.232 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96579.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.96579.log.gz 2026-03-25T15:45:45.232 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96619.log 2026-03-25T15:45:45.232 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96599.log.gz 2026-03-25T15:45:45.233 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96639.log 2026-03-25T15:45:45.233 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96619.log.gz 2026-03-25T15:45:45.233 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96659.log 2026-03-25T15:45:45.234 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96639.log.gz 2026-03-25T15:45:45.234 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96679.log 2026-03-25T15:45:45.234 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96659.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96659.log.gz 2026-03-25T15:45:45.234 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96700.log 2026-03-25T15:45:45.235 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96679.log.gz 2026-03-25T15:45:45.235 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96721.log 2026-03-25T15:45:45.235 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96700.log.gz 2026-03-25T15:45:45.236 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.96742.log 2026-03-25T15:45:45.236 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96721.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96721.log.gz 2026-03-25T15:45:45.236 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96763.log 2026-03-25T15:45:45.236 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96742.log.gz 2026-03-25T15:45:45.237 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96782.log 2026-03-25T15:45:45.237 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96763.log.gz 2026-03-25T15:45:45.237 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96804.log 2026-03-25T15:45:45.238 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96782.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.96782.log.gz 2026-03-25T15:45:45.238 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96823.log 2026-03-25T15:45:45.238 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96804.log.gz 2026-03-25T15:45:45.238 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96841.log 2026-03-25T15:45:45.239 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96823.log.gz 2026-03-25T15:45:45.239 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96859.log 2026-03-25T15:45:45.239 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96841.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.96841.log.gz 2026-03-25T15:45:45.239 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96877.log 2026-03-25T15:45:45.240 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96859.log.gz 2026-03-25T15:45:45.240 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96898.log 2026-03-25T15:45:45.240 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96877.log.gz 2026-03-25T15:45:45.241 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96917.log 2026-03-25T15:45:45.241 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96898.log.gz 2026-03-25T15:45:45.241 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96939.log 2026-03-25T15:45:45.241 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96917.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.96917.log.gz 2026-03-25T15:45:45.242 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96958.log 2026-03-25T15:45:45.242 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96939.log.gz 2026-03-25T15:45:45.242 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.96976.log 2026-03-25T15:45:45.243 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96958.log.gz 2026-03-25T15:45:45.243 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.96994.log 2026-03-25T15:45:45.243 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96976.log.gz 2026-03-25T15:45:45.243 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97015.log 2026-03-25T15:45:45.244 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.96994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.96994.log.gz 2026-03-25T15:45:45.244 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97034.log 2026-03-25T15:45:45.244 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97015.log.gz 2026-03-25T15:45:45.244 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97055.log 2026-03-25T15:45:45.245 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97034.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.97034.log.gz 2026-03-25T15:45:45.245 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97074.log 2026-03-25T15:45:45.245 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97055.log.gz 2026-03-25T15:45:45.246 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97091.log 2026-03-25T15:45:45.246 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97074.log.gz 2026-03-25T15:45:45.246 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97111.log 2026-03-25T15:45:45.246 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97091.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.97091.log.gz 2026-03-25T15:45:45.247 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97131.log 2026-03-25T15:45:45.247 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97111.log.gz 2026-03-25T15:45:45.247 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97151.log 2026-03-25T15:45:45.248 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97131.log.gz 2026-03-25T15:45:45.248 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97168.log 2026-03-25T15:45:45.248 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97151.log.gz 2026-03-25T15:45:45.248 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97188.log 2026-03-25T15:45:45.249 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97168.log.gz 2026-03-25T15:45:45.249 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97208.log 2026-03-25T15:45:45.249 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97188.log.gz 2026-03-25T15:45:45.249 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97228.log 2026-03-25T15:45:45.250 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97208.log.gz 2026-03-25T15:45:45.250 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.97248.log 2026-03-25T15:45:45.250 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97228.log.gz 2026-03-25T15:45:45.251 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97268.log 2026-03-25T15:45:45.251 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97248.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97248.log.gz 2026-03-25T15:45:45.251 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97288.log 2026-03-25T15:45:45.251 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97268.log.gz 2026-03-25T15:45:45.252 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97308.log 2026-03-25T15:45:45.252 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97288.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97288.log.gz 2026-03-25T15:45:45.252 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97328.log 2026-03-25T15:45:45.253 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97308.log.gz 2026-03-25T15:45:45.253 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97349.log 2026-03-25T15:45:45.253 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97328.log.gz 2026-03-25T15:45:45.253 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97370.log 2026-03-25T15:45:45.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97349.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.97349.log.gz 2026-03-25T15:45:45.254 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97391.log 2026-03-25T15:45:45.254 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97370.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97370.log.gz 2026-03-25T15:45:45.254 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97412.log 2026-03-25T15:45:45.255 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97391.log.gz 2026-03-25T15:45:45.255 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97431.log 2026-03-25T15:45:45.255 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97412.log.gz 2026-03-25T15:45:45.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97452.log 2026-03-25T15:45:45.256 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97431.log: 59.0% -- replaced with /var/log/ceph/ceph-client.admin.97431.log.gz 2026-03-25T15:45:45.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97471.log 2026-03-25T15:45:45.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97452.log.gz 2026-03-25T15:45:45.257 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97492.log 2026-03-25T15:45:45.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97471.log.gz 2026-03-25T15:45:45.258 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.97511.log 2026-03-25T15:45:45.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97492.log.gz 2026-03-25T15:45:45.258 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97533.log 2026-03-25T15:45:45.258 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97511.log: 52.8% -- replaced with /var/log/ceph/ceph-client.admin.97511.log.gz 2026-03-25T15:45:45.259 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97552.log 2026-03-25T15:45:45.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97533.log.gz 2026-03-25T15:45:45.259 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97570.log 2026-03-25T15:45:45.259 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97552.log.gz 2026-03-25T15:45:45.260 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97588.log 2026-03-25T15:45:45.260 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97570.log.gz 2026-03-25T15:45:45.260 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97606.log 2026-03-25T15:45:45.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97588.log.gz 2026-03-25T15:45:45.261 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97627.log 2026-03-25T15:45:45.261 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97606.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.97606.log.gz 2026-03-25T15:45:45.261 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97646.log 2026-03-25T15:45:45.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97627.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97627.log.gz 2026-03-25T15:45:45.262 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97668.log 2026-03-25T15:45:45.262 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97646.log: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.97646.log.gz 2026-03-25T15:45:45.263 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97687.log 2026-03-25T15:45:45.263 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97668.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97668.log.gz 2026-03-25T15:45:45.263 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97706.log 2026-03-25T15:45:45.263 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97687.log.gz 2026-03-25T15:45:45.264 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97724.log 2026-03-25T15:45:45.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97706.log.gz 2026-03-25T15:45:45.264 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97744.log 2026-03-25T15:45:45.264 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97724.log.gz 2026-03-25T15:45:45.265 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.97765.log 2026-03-25T15:45:45.265 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97744.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97744.log.gz 2026-03-25T15:45:45.265 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97784.log 2026-03-25T15:45:45.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97765.log.gz 2026-03-25T15:45:45.266 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97801.log 2026-03-25T15:45:45.266 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97784.log.gz 2026-03-25T15:45:45.266 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97821.log 2026-03-25T15:45:45.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97801.log.gz 2026-03-25T15:45:45.267 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97841.log 2026-03-25T15:45:45.267 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97821.log.gz 2026-03-25T15:45:45.267 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97861.log 2026-03-25T15:45:45.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97841.log.gz 2026-03-25T15:45:45.268 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97878.log 2026-03-25T15:45:45.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97861.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.97861.log.gz 2026-03-25T15:45:45.269 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97898.log 2026-03-25T15:45:45.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97878.log.gz 2026-03-25T15:45:45.269 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97918.log 2026-03-25T15:45:45.270 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97898.log.gz 2026-03-25T15:45:45.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97938.log 2026-03-25T15:45:45.270 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97918.log.gz 2026-03-25T15:45:45.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97958.log 2026-03-25T15:45:45.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97938.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97938.log.gz 2026-03-25T15:45:45.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97978.log 2026-03-25T15:45:45.271 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97958.log.gz 2026-03-25T15:45:45.271 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.97998.log 2026-03-25T15:45:45.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97978.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97978.log.gz 2026-03-25T15:45:45.272 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.98018.log
2026-03-25T15:45:45.272 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.97998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.97998.log.gz
2026-03-25T15:45:45.272 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98038.log
2026-03-25T15:45:45.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98018.log.gz
2026-03-25T15:45:45.273 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98058.log
2026-03-25T15:45:45.273 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98038.log.gz
2026-03-25T15:45:45.274 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98079.log
2026-03-25T15:45:45.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98058.log.gz
2026-03-25T15:45:45.274 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98099.log
2026-03-25T15:45:45.274 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98079.log.gz
2026-03-25T15:45:45.275 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98118.log
2026-03-25T15:45:45.275 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98099.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98099.log.gz
2026-03-25T15:45:45.275 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98140.log
2026-03-25T15:45:45.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98118.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.98118.log.gz
2026-03-25T15:45:45.276 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98159.log
2026-03-25T15:45:45.276 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98140.log.gz
2026-03-25T15:45:45.277 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98177.log
2026-03-25T15:45:45.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98159.log.gz
2026-03-25T15:45:45.277 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98195.log
2026-03-25T15:45:45.277 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98177.log.gz
2026-03-25T15:45:45.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98213.log
2026-03-25T15:45:45.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98195.log.gz
2026-03-25T15:45:45.278 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98234.log
2026-03-25T15:45:45.278 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98213.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98213.log.gz
2026-03-25T15:45:45.279 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98253.log
2026-03-25T15:45:45.279 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98234.log.gz
2026-03-25T15:45:45.279 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98275.log
2026-03-25T15:45:45.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98253.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.98253.log.gz
2026-03-25T15:45:45.280 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98294.log
2026-03-25T15:45:45.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98275.log.gz
2026-03-25T15:45:45.280 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98312.log
2026-03-25T15:45:45.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98294.log.gz
2026-03-25T15:45:45.281 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98330.log
2026-03-25T15:45:45.281 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98312.log.gz
2026-03-25T15:45:45.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98351.log
2026-03-25T15:45:45.282 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98330.log.gz
2026-03-25T15:45:45.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98370.log
2026-03-25T15:45:45.282 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98351.log.gz
2026-03-25T15:45:45.283 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98391.log
2026-03-25T15:45:45.283 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98370.log: 56.5% -- replaced with /var/log/ceph/ceph-client.admin.98370.log.gz
2026-03-25T15:45:45.283 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98410.log
2026-03-25T15:45:45.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98391.log.gz
2026-03-25T15:45:45.284 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98427.log
2026-03-25T15:45:45.284 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98410.log.gz
2026-03-25T15:45:45.284 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98447.log
2026-03-25T15:45:45.285 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98427.log.gz
2026-03-25T15:45:45.285 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98467.log
2026-03-25T15:45:45.285 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98447.log.gz
2026-03-25T15:45:45.286 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98487.log
2026-03-25T15:45:45.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98467.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98467.log.gz
2026-03-25T15:45:45.286 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98504.log
2026-03-25T15:45:45.286 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98487.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98487.log.gz
2026-03-25T15:45:45.287 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98524.log
2026-03-25T15:45:45.287 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98504.log.gz
2026-03-25T15:45:45.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98544.log
2026-03-25T15:45:45.288 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98524.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98524.log.gz
2026-03-25T15:45:45.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98564.log
2026-03-25T15:45:45.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98544.log.gz
2026-03-25T15:45:45.289 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98584.log
2026-03-25T15:45:45.289 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98564.log.gz
2026-03-25T15:45:45.290 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98604.log
2026-03-25T15:45:45.290 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98584.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98584.log.gz
2026-03-25T15:45:45.290 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98624.log
2026-03-25T15:45:45.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98604.log.gz
2026-03-25T15:45:45.291 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98644.log
2026-03-25T15:45:45.291 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98624.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98624.log.gz
2026-03-25T15:45:45.292 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98664.log
2026-03-25T15:45:45.292 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98644.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98644.log.gz
2026-03-25T15:45:45.292 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98684.log
2026-03-25T15:45:45.293 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98664.log.gz
2026-03-25T15:45:45.293 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98704.log
2026-03-25T15:45:45.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98684.log.gz
2026-03-25T15:45:45.294 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98725.log
2026-03-25T15:45:45.294 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98704.log.gz
2026-03-25T15:45:45.295 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98745.log
2026-03-25T15:45:45.295 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98725.log.gz
2026-03-25T15:45:45.295 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98764.log
2026-03-25T15:45:45.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98745.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98745.log.gz
2026-03-25T15:45:45.296 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98785.log
2026-03-25T15:45:45.296 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98764.log: 58.4% -- replaced with /var/log/ceph/ceph-client.admin.98764.log.gz
2026-03-25T15:45:45.297 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98804.log
2026-03-25T15:45:45.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98785.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98785.log.gz
2026-03-25T15:45:45.298 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98825.log
2026-03-25T15:45:45.298 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98804.log.gz
2026-03-25T15:45:45.298 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98844.log
2026-03-25T15:45:45.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98825.log.gz
2026-03-25T15:45:45.299 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98866.log
2026-03-25T15:45:45.299 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98844.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.98844.log.gz
2026-03-25T15:45:45.300 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98885.log
2026-03-25T15:45:45.300 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98866.log.gz
2026-03-25T15:45:45.301 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98903.log
2026-03-25T15:45:45.301 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98885.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98885.log.gz
2026-03-25T15:45:45.301 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98921.log
2026-03-25T15:45:45.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98903.log.gz
2026-03-25T15:45:45.302 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98939.log
2026-03-25T15:45:45.302 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98921.log.gz
2026-03-25T15:45:45.303 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98960.log
2026-03-25T15:45:45.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98939.log.gz
2026-03-25T15:45:45.303 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.98979.log
2026-03-25T15:45:45.303 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.98960.log.gz
2026-03-25T15:45:45.304 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99001.log
2026-03-25T15:45:45.304 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.98979.log: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.98979.log.gz
2026-03-25T15:45:45.304 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99020.log
2026-03-25T15:45:45.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99001.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99001.log.gz
2026-03-25T15:45:45.305 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99039.log
2026-03-25T15:45:45.305 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99020.log.gz
2026-03-25T15:45:45.305 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99057.log
2026-03-25T15:45:45.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99039.log.gz
2026-03-25T15:45:45.306 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99077.log
2026-03-25T15:45:45.306 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99057.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99057.log.gz
2026-03-25T15:45:45.307 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99098.log
2026-03-25T15:45:45.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99077.log.gz
2026-03-25T15:45:45.307 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99118.log
2026-03-25T15:45:45.307 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99098.log.gz
2026-03-25T15:45:45.308 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99139.log
2026-03-25T15:45:45.308 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99118.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99118.log.gz
2026-03-25T15:45:45.308 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99161.log
2026-03-25T15:45:45.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99139.log.gz
2026-03-25T15:45:45.309 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99182.log
2026-03-25T15:45:45.309 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99161.log.gz
2026-03-25T15:45:45.309 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99204.log
2026-03-25T15:45:45.310 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99182.log.gz
2026-03-25T15:45:45.310 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99225.log
2026-03-25T15:45:45.310 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99204.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99204.log.gz
2026-03-25T15:45:45.311 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99247.log
2026-03-25T15:45:45.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99225.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99225.log.gz
2026-03-25T15:45:45.311 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99268.log
2026-03-25T15:45:45.311 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99247.log.gz
2026-03-25T15:45:45.312 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99290.log
2026-03-25T15:45:45.312 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99268.log.gz
2026-03-25T15:45:45.312 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99311.log
2026-03-25T15:45:45.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99290.log.gz
2026-03-25T15:45:45.313 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99333.log
2026-03-25T15:45:45.313 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99311.log.gz
2026-03-25T15:45:45.313 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99354.log
2026-03-25T15:45:45.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99333.log.gz
2026-03-25T15:45:45.314 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99376.log
2026-03-25T15:45:45.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99354.log.gz
2026-03-25T15:45:45.314 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99397.log
2026-03-25T15:45:45.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99376.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99376.log.gz
2026-03-25T15:45:45.315 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99419.log
2026-03-25T15:45:45.315 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99397.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99397.log.gz
2026-03-25T15:45:45.316 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99440.log
2026-03-25T15:45:45.316 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99419.log.gz
2026-03-25T15:45:45.316 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99462.log
2026-03-25T15:45:45.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99440.log.gz
2026-03-25T15:45:45.317 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99483.log
2026-03-25T15:45:45.317 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99462.log.gz
2026-03-25T15:45:45.317 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99505.log
2026-03-25T15:45:45.318 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99483.log.gz
2026-03-25T15:45:45.318 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99526.log
2026-03-25T15:45:45.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99505.log.gz
2026-03-25T15:45:45.319 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99548.log
2026-03-25T15:45:45.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99526.log.gz
2026-03-25T15:45:45.320 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99569.log
2026-03-25T15:45:45.320 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99548.log.gz
2026-03-25T15:45:45.320 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99591.log
2026-03-25T15:45:45.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99569.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99569.log.gz
2026-03-25T15:45:45.321 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99612.log
2026-03-25T15:45:45.321 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99591.log.gz
2026-03-25T15:45:45.322 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99634.log
2026-03-25T15:45:45.322 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99612.log.gz
2026-03-25T15:45:45.322 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99655.log
2026-03-25T15:45:45.323 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99634.log.gz
2026-03-25T15:45:45.323 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99677.log
2026-03-25T15:45:45.323 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99655.log.gz
2026-03-25T15:45:45.323 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99698.log
2026-03-25T15:45:45.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99677.log.gz
2026-03-25T15:45:45.324 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99720.log
2026-03-25T15:45:45.324 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99698.log.gz
2026-03-25T15:45:45.325 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99741.log
2026-03-25T15:45:45.325 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99720.log.gz
2026-03-25T15:45:45.325 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99763.log
2026-03-25T15:45:45.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99741.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99741.log.gz
2026-03-25T15:45:45.326 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99784.log
2026-03-25T15:45:45.326 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99763.log.gz
2026-03-25T15:45:45.327 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99806.log
2026-03-25T15:45:45.327 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99784.log.gz
2026-03-25T15:45:45.327 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99827.log
2026-03-25T15:45:45.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99806.log.gz
2026-03-25T15:45:45.328 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99849.log
2026-03-25T15:45:45.328 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99827.log.gz
2026-03-25T15:45:45.328 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99870.log
2026-03-25T15:45:45.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99849.log.gz
2026-03-25T15:45:45.329 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99891.log
2026-03-25T15:45:45.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99870.log.gz
2026-03-25T15:45:45.330 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99908.log
2026-03-25T15:45:45.330 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99891.log.gz
2026-03-25T15:45:45.330 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99928.log
2026-03-25T15:45:45.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99908.log.gz
2026-03-25T15:45:45.331 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99948.log
2026-03-25T15:45:45.331 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99928.log.gz
2026-03-25T15:45:45.332 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99968.log
2026-03-25T15:45:45.332 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99948.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99948.log.gz
2026-03-25T15:45:45.332 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.99988.log
2026-03-25T15:45:45.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.99968.log.gz
2026-03-25T15:45:45.333 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100008.log
2026-03-25T15:45:45.333 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.99988.log: 57.5% -- replaced with /var/log/ceph/ceph-client.admin.99988.log.gz
2026-03-25T15:45:45.333 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100029.log
2026-03-25T15:45:45.334 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100008.log.gz
2026-03-25T15:45:45.334 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100050.log
2026-03-25T15:45:45.334 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100029.log.gz
2026-03-25T15:45:45.335 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100073.log
2026-03-25T15:45:45.335 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100050.log.gz
2026-03-25T15:45:45.335 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100094.log
2026-03-25T15:45:45.335 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100073.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100073.log.gz
2026-03-25T15:45:45.336 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100115.log
2026-03-25T15:45:45.336 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100094.log.gz
2026-03-25T15:45:45.336 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100136.log
2026-03-25T15:45:45.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100115.log.gz
2026-03-25T15:45:45.337 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100156.log
2026-03-25T15:45:45.337 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100136.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100136.log.gz
2026-03-25T15:45:45.337 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100176.log
2026-03-25T15:45:45.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100156.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100156.log.gz
2026-03-25T15:45:45.338 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100196.log
2026-03-25T15:45:45.338 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100176.log.gz
2026-03-25T15:45:45.339 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100216.log
2026-03-25T15:45:45.339 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100196.log.gz
2026-03-25T15:45:45.339 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100236.log
2026-03-25T15:45:45.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100216.log.gz
2026-03-25T15:45:45.340 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100257.log
2026-03-25T15:45:45.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100236.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.100236.log.gz
2026-03-25T15:45:45.341 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100278.log
2026-03-25T15:45:45.341 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100257.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.100257.log.gz
2026-03-25T15:45:45.341 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100298.log
2026-03-25T15:45:45.342 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100278.log.gz
2026-03-25T15:45:45.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100318.log
2026-03-25T15:45:45.342 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100298.log.gz
2026-03-25T15:45:45.342 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100338.log
2026-03-25T15:45:45.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100318.log.gz
2026-03-25T15:45:45.343 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100358.log
2026-03-25T15:45:45.343 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100338.log: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.100338.log.gz
2026-03-25T15:45:45.344 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100379.log
2026-03-25T15:45:45.344 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100358.log.gz
2026-03-25T15:45:45.344 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100400.log
2026-03-25T15:45:45.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100379.log.gz
2026-03-25T15:45:45.345 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100423.log
2026-03-25T15:45:45.345 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100400.log.gz
2026-03-25T15:45:45.345 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100444.log
2026-03-25T15:45:45.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100423.log.gz
2026-03-25T15:45:45.346 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100465.log
2026-03-25T15:45:45.346 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100444.log.gz
2026-03-25T15:45:45.347 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100485.log
2026-03-25T15:45:45.347 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100465.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100465.log.gz
2026-03-25T15:45:45.347 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100506.log
2026-03-25T15:45:45.347 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100485.log.gz
2026-03-25T15:45:45.348 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100527.log
2026-03-25T15:45:45.348 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100506.log.gz
2026-03-25T15:45:45.348 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100549.log
2026-03-25T15:45:45.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100527.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.100527.log.gz
2026-03-25T15:45:45.349 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100570.log
2026-03-25T15:45:45.349 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100549.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100549.log.gz
2026-03-25T15:45:45.349 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100592.log
2026-03-25T15:45:45.350 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100570.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.100570.log.gz
2026-03-25T15:45:45.350 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100613.log
2026-03-25T15:45:45.350 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100592.log.gz
2026-03-25T15:45:45.351 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100635.log
2026-03-25T15:45:45.351 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100613.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.100613.log.gz
2026-03-25T15:45:45.351 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100656.log
2026-03-25T15:45:45.352 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100635.log.gz
2026-03-25T15:45:45.352 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100678.log
2026-03-25T15:45:45.352 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100656.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100656.log.gz
2026-03-25T15:45:45.352 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100699.log
2026-03-25T15:45:45.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100678.log.gz
2026-03-25T15:45:45.353 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100721.log
2026-03-25T15:45:45.353 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100699.log.gz
2026-03-25T15:45:45.354 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100742.log
2026-03-25T15:45:45.354 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100721.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100721.log.gz
2026-03-25T15:45:45.354 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100764.log
2026-03-25T15:45:45.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100742.log.gz
2026-03-25T15:45:45.355 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100785.log
2026-03-25T15:45:45.355 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100764.log: 0.0% -- replaced with
/var/log/ceph/ceph-client.admin.100764.log.gz 2026-03-25T15:45:45.355 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100807.log 2026-03-25T15:45:45.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100785.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100785.log.gz 2026-03-25T15:45:45.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100828.log 2026-03-25T15:45:45.356 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100807.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100807.log.gz 2026-03-25T15:45:45.356 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100850.log 2026-03-25T15:45:45.357 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100828.log.gz 2026-03-25T15:45:45.357 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100871.log 2026-03-25T15:45:45.357 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100850.log.gz 2026-03-25T15:45:45.358 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100893.log 2026-03-25T15:45:45.358 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100871.log.gz 2026-03-25T15:45:45.358 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100914.log 2026-03-25T15:45:45.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100893.log.gz 2026-03-25T15:45:45.359 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.100936.log 2026-03-25T15:45:45.359 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100914.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100914.log.gz 2026-03-25T15:45:45.360 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100957.log 2026-03-25T15:45:45.360 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100936.log.gz 2026-03-25T15:45:45.360 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.100979.log 2026-03-25T15:45:45.360 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100957.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100957.log.gz 2026-03-25T15:45:45.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101000.log 2026-03-25T15:45:45.361 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.100979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.100979.log.gz 2026-03-25T15:45:45.361 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101022.log 2026-03-25T15:45:45.362 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101000.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101000.log.gz 2026-03-25T15:45:45.362 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101043.log 2026-03-25T15:45:45.362 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101022.log.gz 2026-03-25T15:45:45.362 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101065.log 2026-03-25T15:45:45.363 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101043.log.gz 2026-03-25T15:45:45.363 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101086.log 2026-03-25T15:45:45.363 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101065.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101065.log.gz 2026-03-25T15:45:45.364 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101108.log 2026-03-25T15:45:45.364 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101086.log.gz 2026-03-25T15:45:45.364 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101129.log 2026-03-25T15:45:45.365 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101108.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101108.log.gz 2026-03-25T15:45:45.365 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101151.log 2026-03-25T15:45:45.365 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101129.log.gz 2026-03-25T15:45:45.365 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101172.log 2026-03-25T15:45:45.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101151.log.gz 2026-03-25T15:45:45.366 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101194.log 2026-03-25T15:45:45.366 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101172.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.101172.log.gz 2026-03-25T15:45:45.367 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101215.log 2026-03-25T15:45:45.367 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101194.log.gz 2026-03-25T15:45:45.367 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101237.log 2026-03-25T15:45:45.367 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101215.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101215.log.gz 2026-03-25T15:45:45.368 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101258.log 2026-03-25T15:45:45.368 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101237.log.gz 2026-03-25T15:45:45.368 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101280.log 2026-03-25T15:45:45.369 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101258.log.gz 2026-03-25T15:45:45.369 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101301.log 2026-03-25T15:45:45.369 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101280.log.gz 2026-03-25T15:45:45.370 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101323.log 2026-03-25T15:45:45.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101301.log.gz 2026-03-25T15:45:45.370 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.101344.log 2026-03-25T15:45:45.370 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101323.log.gz 2026-03-25T15:45:45.371 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101366.log 2026-03-25T15:45:45.371 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101344.log.gz 2026-03-25T15:45:45.371 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101387.log 2026-03-25T15:45:45.372 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101366.log.gz 2026-03-25T15:45:45.372 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101409.log 2026-03-25T15:45:45.372 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101387.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101387.log.gz 2026-03-25T15:45:45.372 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101430.log 2026-03-25T15:45:45.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101409.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101409.log.gz 2026-03-25T15:45:45.373 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101452.log 2026-03-25T15:45:45.373 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101430.log.gz 2026-03-25T15:45:45.374 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101473.log 2026-03-25T15:45:45.374 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101452.log.gz 2026-03-25T15:45:45.374 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101495.log 2026-03-25T15:45:45.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101473.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101473.log.gz 2026-03-25T15:45:45.375 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101516.log 2026-03-25T15:45:45.375 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101495.log.gz 2026-03-25T15:45:45.375 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101538.log 2026-03-25T15:45:45.376 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101516.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101516.log.gz 2026-03-25T15:45:45.376 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101559.log 2026-03-25T15:45:45.376 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101538.log.gz 2026-03-25T15:45:45.376 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101581.log 2026-03-25T15:45:45.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101559.log.gz 2026-03-25T15:45:45.377 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101602.log 2026-03-25T15:45:45.377 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101581.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.101581.log.gz 2026-03-25T15:45:45.378 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101624.log 2026-03-25T15:45:45.378 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101602.log.gz 2026-03-25T15:45:45.378 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101645.log 2026-03-25T15:45:45.379 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101624.log: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.101624.log.gz 2026-03-25T15:45:45.379 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101667.log 2026-03-25T15:45:45.379 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101645.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101645.log.gz 2026-03-25T15:45:45.380 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101688.log 2026-03-25T15:45:45.380 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101667.log.gz 2026-03-25T15:45:45.380 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101710.log 2026-03-25T15:45:45.380 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101688.log.gz 2026-03-25T15:45:45.381 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101731.log 2026-03-25T15:45:45.381 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101710.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101710.log.gz 2026-03-25T15:45:45.381 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.101753.log 2026-03-25T15:45:45.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101731.log.gz 2026-03-25T15:45:45.382 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101774.log 2026-03-25T15:45:45.382 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101753.log.gz 2026-03-25T15:45:45.382 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101796.log 2026-03-25T15:45:45.383 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101774.log.gz 2026-03-25T15:45:45.383 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101817.log 2026-03-25T15:45:45.383 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101796.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101796.log.gz 2026-03-25T15:45:45.384 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101839.log 2026-03-25T15:45:45.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101817.log.gz 2026-03-25T15:45:45.384 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101860.log 2026-03-25T15:45:45.384 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101839.log.gz 2026-03-25T15:45:45.385 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101882.log 2026-03-25T15:45:45.385 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101860.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101860.log.gz 2026-03-25T15:45:45.385 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101903.log 2026-03-25T15:45:45.386 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101882.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101882.log.gz 2026-03-25T15:45:45.386 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101925.log 2026-03-25T15:45:45.386 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101903.log.gz 2026-03-25T15:45:45.386 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101946.log 2026-03-25T15:45:45.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101925.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101925.log.gz 2026-03-25T15:45:45.387 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101968.log 2026-03-25T15:45:45.387 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101946.log.gz 2026-03-25T15:45:45.388 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.101989.log 2026-03-25T15:45:45.388 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.101968.log.gz 2026-03-25T15:45:45.388 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102011.log 2026-03-25T15:45:45.388 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.101989.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.101989.log.gz 2026-03-25T15:45:45.389 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102032.log 2026-03-25T15:45:45.389 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102011.log.gz 2026-03-25T15:45:45.389 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102053.log 2026-03-25T15:45:45.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102032.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102032.log.gz 2026-03-25T15:45:45.390 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102070.log 2026-03-25T15:45:45.390 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102053.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102053.log.gz 2026-03-25T15:45:45.390 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102090.log 2026-03-25T15:45:45.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102070.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102070.log.gz 2026-03-25T15:45:45.391 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102110.log 2026-03-25T15:45:45.391 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102090.log: 12.9% -- replaced with /var/log/ceph/ceph-client.admin.102090.log.gz 2026-03-25T15:45:45.392 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102131.log 2026-03-25T15:45:45.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102110.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102110.log.gz 2026-03-25T15:45:45.392 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.102152.log 2026-03-25T15:45:45.392 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102131.log.gz 2026-03-25T15:45:45.393 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102172.log 2026-03-25T15:45:45.393 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102152.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102152.log.gz 2026-03-25T15:45:45.393 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102192.log 2026-03-25T15:45:45.394 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102172.log.gz 2026-03-25T15:45:45.394 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102212.log 2026-03-25T15:45:45.394 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102192.log.gz 2026-03-25T15:45:45.394 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102235.log 2026-03-25T15:45:45.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102212.log.gz 2026-03-25T15:45:45.395 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102258.log 2026-03-25T15:45:45.395 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102235.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102235.log.gz 2026-03-25T15:45:45.396 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102279.log 2026-03-25T15:45:45.396 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102258.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.102258.log.gz 2026-03-25T15:45:45.396 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102299.log 2026-03-25T15:45:45.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102279.log.gz 2026-03-25T15:45:45.397 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102319.log 2026-03-25T15:45:45.397 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102299.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102299.log.gz 2026-03-25T15:45:45.397 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102340.log 2026-03-25T15:45:45.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102319.log.gz 2026-03-25T15:45:45.398 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102360.log 2026-03-25T15:45:45.398 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102340.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102340.log.gz 2026-03-25T15:45:45.398 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102381.log 2026-03-25T15:45:45.399 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102360.log.gz 2026-03-25T15:45:45.399 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102401.log 2026-03-25T15:45:45.400 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102381.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.102381.log.gz 2026-03-25T15:45:45.400 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102422.log 2026-03-25T15:45:45.400 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102401.log: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.102401.log.gz 2026-03-25T15:45:45.400 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102443.log 2026-03-25T15:45:45.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102422.log: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.102422.log.gz 2026-03-25T15:45:45.401 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102464.log 2026-03-25T15:45:45.401 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102443.log: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.102443.log.gz 2026-03-25T15:45:45.402 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102484.log 2026-03-25T15:45:45.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102464.log.gz 2026-03-25T15:45:45.402 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102506.log 2026-03-25T15:45:45.402 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102484.log.gz 2026-03-25T15:45:45.403 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102527.log 2026-03-25T15:45:45.403 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102506.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.102506.log.gz 2026-03-25T15:45:45.403 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.102548.log 2026-03-25T15:45:45.404 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102527.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.102527.log.gz 2026-03-25T15:45:45.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102565.log 2026-03-25T15:45:45.404 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102548.log.gz 2026-03-25T15:45:45.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102585.log 2026-03-25T15:45:45.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102565.log.gz 2026-03-25T15:45:45.405 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102605.log 2026-03-25T15:45:45.405 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102585.log.gz 2026-03-25T15:45:45.406 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102625.log 2026-03-25T15:45:45.406 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102605.log.gz 2026-03-25T15:45:45.406 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102645.log 2026-03-25T15:45:45.407 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102625.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102625.log.gz 2026-03-25T15:45:45.407 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102665.log 2026-03-25T15:45:45.407 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102645.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102645.log.gz 2026-03-25T15:45:45.407 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102685.log 2026-03-25T15:45:45.408 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102665.log.gz 2026-03-25T15:45:45.408 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102705.log 2026-03-25T15:45:45.408 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102685.log: 11.1% -- replaced with /var/log/ceph/ceph-client.admin.102685.log.gz 2026-03-25T15:45:45.408 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102725.log 2026-03-25T15:45:45.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102705.log.gz 2026-03-25T15:45:45.409 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102743.log 2026-03-25T15:45:45.409 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102725.log.gz 2026-03-25T15:45:45.410 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102764.log 2026-03-25T15:45:45.410 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102743.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.102743.log.gz 2026-03-25T15:45:45.410 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102782.log 2026-03-25T15:45:45.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102764.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.102764.log.gz 2026-03-25T15:45:45.411 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102803.log 2026-03-25T15:45:45.411 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102782.log: 52.9% -- replaced with /var/log/ceph/ceph-client.admin.102782.log.gz 2026-03-25T15:45:45.411 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102823.log 2026-03-25T15:45:45.412 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102803.log.gz 2026-03-25T15:45:45.412 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102844.log 2026-03-25T15:45:45.412 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102823.log.gz 2026-03-25T15:45:45.412 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102866.log 2026-03-25T15:45:45.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102844.log.gz 2026-03-25T15:45:45.413 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102887.log 2026-03-25T15:45:45.413 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102866.log.gz 2026-03-25T15:45:45.414 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102909.log 2026-03-25T15:45:45.414 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102887.log.gz 2026-03-25T15:45:45.414 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.102930.log 2026-03-25T15:45:45.415 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102909.log.gz 2026-03-25T15:45:45.415 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102952.log 2026-03-25T15:45:45.415 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102930.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102930.log.gz 2026-03-25T15:45:45.415 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102973.log 2026-03-25T15:45:45.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102952.log.gz 2026-03-25T15:45:45.416 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.102995.log 2026-03-25T15:45:45.416 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102973.log.gz 2026-03-25T15:45:45.417 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103016.log 2026-03-25T15:45:45.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.102995.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.102995.log.gz 2026-03-25T15:45:45.417 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103038.log 2026-03-25T15:45:45.417 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103016.log.gz 2026-03-25T15:45:45.418 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103059.log 2026-03-25T15:45:45.418 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103038.log.gz 2026-03-25T15:45:45.418 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103081.log 2026-03-25T15:45:45.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103059.log.gz 2026-03-25T15:45:45.419 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103102.log 2026-03-25T15:45:45.419 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103081.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103081.log.gz 2026-03-25T15:45:45.420 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103124.log 2026-03-25T15:45:45.420 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103102.log.gz 2026-03-25T15:45:45.420 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103145.log 2026-03-25T15:45:45.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103124.log.gz 2026-03-25T15:45:45.421 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103167.log 2026-03-25T15:45:45.421 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103145.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103145.log.gz 2026-03-25T15:45:45.422 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103188.log 2026-03-25T15:45:45.422 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103167.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.103167.log.gz 2026-03-25T15:45:45.422 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103210.log 2026-03-25T15:45:45.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103188.log.gz 2026-03-25T15:45:45.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103231.log 2026-03-25T15:45:45.423 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103210.log.gz 2026-03-25T15:45:45.423 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103253.log 2026-03-25T15:45:45.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103231.log.gz 2026-03-25T15:45:45.424 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103274.log 2026-03-25T15:45:45.424 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103253.log.gz 2026-03-25T15:45:45.425 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103296.log 2026-03-25T15:45:45.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103274.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103274.log.gz 2026-03-25T15:45:45.425 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103317.log 2026-03-25T15:45:45.425 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103296.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103296.log.gz 2026-03-25T15:45:45.426 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.103339.log 2026-03-25T15:45:45.426 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103317.log.gz 2026-03-25T15:45:45.426 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103360.log 2026-03-25T15:45:45.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103339.log.gz 2026-03-25T15:45:45.427 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103382.log 2026-03-25T15:45:45.427 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103360.log.gz 2026-03-25T15:45:45.428 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103403.log 2026-03-25T15:45:45.428 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103382.log.gz 2026-03-25T15:45:45.428 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103425.log 2026-03-25T15:45:45.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103403.log.gz 2026-03-25T15:45:45.429 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103446.log 2026-03-25T15:45:45.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103425.log.gz 2026-03-25T15:45:45.430 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103468.log 2026-03-25T15:45:45.430 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103446.log.gz 2026-03-25T15:45:45.430 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103489.log 2026-03-25T15:45:45.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103468.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103468.log.gz 2026-03-25T15:45:45.431 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103511.log 2026-03-25T15:45:45.431 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103489.log.gz 2026-03-25T15:45:45.431 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103532.log 2026-03-25T15:45:45.432 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103511.log.gz 2026-03-25T15:45:45.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103554.log 2026-03-25T15:45:45.432 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103532.log.gz 2026-03-25T15:45:45.432 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103575.log 2026-03-25T15:45:45.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103554.log.gz 2026-03-25T15:45:45.433 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103596.log 2026-03-25T15:45:45.433 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103575.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.103575.log.gz 2026-03-25T15:45:45.434 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103616.log 2026-03-25T15:45:45.434 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103596.log.gz 2026-03-25T15:45:45.434 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103640.log 2026-03-25T15:45:45.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103616.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103616.log.gz 2026-03-25T15:45:45.435 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103660.log 2026-03-25T15:45:45.435 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103640.log.gz 2026-03-25T15:45:45.435 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103681.log 2026-03-25T15:45:45.436 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103660.log: 58.4% -- replaced with /var/log/ceph/ceph-client.admin.103660.log.gz 2026-03-25T15:45:45.436 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103701.log 2026-03-25T15:45:45.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103681.log.gz 2026-03-25T15:45:45.437 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103730.log 2026-03-25T15:45:45.437 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103701.log: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.103701.log.gz 2026-03-25T15:45:45.438 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.103754.log 2026-03-25T15:45:45.438 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103730.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103730.log.gz 2026-03-25T15:45:45.438 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103774.log 2026-03-25T15:45:45.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103754.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103754.log.gz 2026-03-25T15:45:45.439 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103795.log 2026-03-25T15:45:45.439 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103774.log: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.103774.log.gz 2026-03-25T15:45:45.439 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103816.log 2026-03-25T15:45:45.440 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103795.log.gz 2026-03-25T15:45:45.440 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103837.log 2026-03-25T15:45:45.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103816.log.gz 2026-03-25T15:45:45.441 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103859.log 2026-03-25T15:45:45.441 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103837.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103837.log.gz 2026-03-25T15:45:45.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103880.log 2026-03-25T15:45:45.442 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103859.log.gz 2026-03-25T15:45:45.442 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103902.log 2026-03-25T15:45:45.443 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103880.log.gz 2026-03-25T15:45:45.443 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103923.log 2026-03-25T15:45:45.443 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103902.log.gz 2026-03-25T15:45:45.443 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103945.log 2026-03-25T15:45:45.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103923.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103923.log.gz 2026-03-25T15:45:45.444 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103966.log 2026-03-25T15:45:45.444 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103945.log.gz 2026-03-25T15:45:45.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.103988.log 2026-03-25T15:45:45.445 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.103966.log.gz 2026-03-25T15:45:45.445 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104009.log 2026-03-25T15:45:45.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.103988.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.103988.log.gz 2026-03-25T15:45:45.446 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104031.log 2026-03-25T15:45:45.446 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104009.log.gz 2026-03-25T15:45:45.446 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104052.log 2026-03-25T15:45:45.447 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104031.log.gz 2026-03-25T15:45:45.447 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104074.log 2026-03-25T15:45:45.447 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104052.log.gz 2026-03-25T15:45:45.448 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104095.log 2026-03-25T15:45:45.448 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104074.log.gz 2026-03-25T15:45:45.448 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104117.log 2026-03-25T15:45:45.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104095.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104095.log.gz 2026-03-25T15:45:45.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104138.log 2026-03-25T15:45:45.449 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104117.log.gz 2026-03-25T15:45:45.449 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.104160.log 2026-03-25T15:45:45.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104138.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104138.log.gz 2026-03-25T15:45:45.450 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104181.log 2026-03-25T15:45:45.450 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104160.log.gz 2026-03-25T15:45:45.450 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104203.log 2026-03-25T15:45:45.451 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104181.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104181.log.gz 2026-03-25T15:45:45.451 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104224.log 2026-03-25T15:45:45.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104203.log.gz 2026-03-25T15:45:45.452 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104246.log 2026-03-25T15:45:45.452 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104224.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104224.log.gz 2026-03-25T15:45:45.452 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104267.log 2026-03-25T15:45:45.453 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104246.log.gz 2026-03-25T15:45:45.453 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104289.log 2026-03-25T15:45:45.453 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104267.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104267.log.gz 2026-03-25T15:45:45.454 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104310.log 2026-03-25T15:45:45.454 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104289.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104289.log.gz 2026-03-25T15:45:45.454 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104332.log 2026-03-25T15:45:45.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104310.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104310.log.gz 2026-03-25T15:45:45.455 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104353.log 2026-03-25T15:45:45.455 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104332.log.gz 2026-03-25T15:45:45.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104375.log 2026-03-25T15:45:45.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104353.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104353.log.gz 2026-03-25T15:45:45.456 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104396.log 2026-03-25T15:45:45.456 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104375.log.gz 2026-03-25T15:45:45.457 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104418.log 2026-03-25T15:45:45.457 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104396.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.104396.log.gz 2026-03-25T15:45:45.457 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104439.log 2026-03-25T15:45:45.458 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104418.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104418.log.gz 2026-03-25T15:45:45.458 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104461.log 2026-03-25T15:45:45.458 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104439.log.gz 2026-03-25T15:45:45.458 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104482.log 2026-03-25T15:45:45.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104461.log.gz 2026-03-25T15:45:45.459 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104504.log 2026-03-25T15:45:45.459 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104482.log.gz 2026-03-25T15:45:45.460 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104525.log 2026-03-25T15:45:45.460 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104504.log.gz 2026-03-25T15:45:45.460 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104547.log 2026-03-25T15:45:45.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104525.log.gz 2026-03-25T15:45:45.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.104568.log 2026-03-25T15:45:45.461 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104547.log.gz 2026-03-25T15:45:45.461 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104589.log 2026-03-25T15:45:45.462 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104568.log.gz 2026-03-25T15:45:45.462 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104608.log 2026-03-25T15:45:45.462 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104589.log.gz 2026-03-25T15:45:45.463 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104625.log 2026-03-25T15:45:45.463 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104608.log.gz 2026-03-25T15:45:45.463 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104642.log 2026-03-25T15:45:45.464 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104625.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104625.log.gz 2026-03-25T15:45:45.464 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104659.log 2026-03-25T15:45:45.464 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104642.log.gz 2026-03-25T15:45:45.464 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104676.log 2026-03-25T15:45:45.465 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104659.log: 11.8% -- replaced with /var/log/ceph/ceph-client.admin.104659.log.gz 2026-03-25T15:45:45.465 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104696.log 2026-03-25T15:45:45.465 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104676.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104676.log.gz 2026-03-25T15:45:45.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104698.log 2026-03-25T15:45:45.466 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104696.log.gz 2026-03-25T15:45:45.466 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104715.log 2026-03-25T15:45:45.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104698.log.gz 2026-03-25T15:45:45.467 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104736.log 2026-03-25T15:45:45.467 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104715.log: 88.2% -- replaced with /var/log/ceph/ceph-client.admin.104715.log.gz 2026-03-25T15:45:45.467 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104756.log 2026-03-25T15:45:45.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104736.log.gz 2026-03-25T15:45:45.468 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104776.log 2026-03-25T15:45:45.468 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104756.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.104756.log.gz
2026-03-25T15:45:45.469 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104798.log
2026-03-25T15:45:45.469 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104776.log.gz
2026-03-25T15:45:45.469 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104800.log
2026-03-25T15:45:45.470 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104798.log: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.104798.log.gz
2026-03-25T15:45:45.470 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104839.log
2026-03-25T15:45:45.470 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104800.log: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.104800.log.gz
2026-03-25T15:45:45.470 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104860.log
2026-03-25T15:45:45.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104839.log: 52.7% -- replaced with /var/log/ceph/ceph-client.admin.104839.log.gz
2026-03-25T15:45:45.471 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104877.log
2026-03-25T15:45:45.471 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104860.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104860.log.gz
2026-03-25T15:45:45.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104898.log
2026-03-25T15:45:45.472 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104877.log: 87.7% -- replaced with /var/log/ceph/ceph-client.admin.104877.log.gz
2026-03-25T15:45:45.472 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104918.log
2026-03-25T15:45:45.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104898.log.gz
2026-03-25T15:45:45.473 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104938.log
2026-03-25T15:45:45.473 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104918.log.gz
2026-03-25T15:45:45.473 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104961.log
2026-03-25T15:45:45.474 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104938.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.104938.log.gz
2026-03-25T15:45:45.474 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.104962.log
2026-03-25T15:45:45.474 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104961.log: 52.7% -- replaced with /var/log/ceph/ceph-client.admin.104961.log.gz
2026-03-25T15:45:45.475 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105001.log
2026-03-25T15:45:45.475 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.104962.log: 56.2% -- replaced with /var/log/ceph/ceph-client.admin.104962.log.gz
2026-03-25T15:45:45.475 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105021.log
2026-03-25T15:45:45.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105001.log: 11.1% -- replaced with /var/log/ceph/ceph-client.admin.105001.log.gz
2026-03-25T15:45:45.476 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105042.log
2026-03-25T15:45:45.476 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105021.log: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.105021.log.gz
2026-03-25T15:45:45.476 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105063.log
2026-03-25T15:45:45.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105042.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.105042.log.gz
2026-03-25T15:45:45.477 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105080.log
2026-03-25T15:45:45.477 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105063.log.gz
2026-03-25T15:45:45.478 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105100.log
2026-03-25T15:45:45.478 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105080.log.gz
2026-03-25T15:45:45.478 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105120.log
2026-03-25T15:45:45.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105100.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105100.log.gz
2026-03-25T15:45:45.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105140.log
2026-03-25T15:45:45.479 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105120.log.gz
2026-03-25T15:45:45.479 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105161.log
2026-03-25T15:45:45.480 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105140.log: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.105140.log.gz
2026-03-25T15:45:45.480 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105181.log
2026-03-25T15:45:45.480 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105161.log.gz
2026-03-25T15:45:45.481 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105201.log
2026-03-25T15:45:45.481 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105181.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105181.log.gz
2026-03-25T15:45:45.481 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105222.log
2026-03-25T15:45:45.482 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105201.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.105201.log.gz
2026-03-25T15:45:45.482 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105239.log
2026-03-25T15:45:45.482 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105222.log.gz
2026-03-25T15:45:45.483 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105256.log
2026-03-25T15:45:45.483 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105239.log.gz
2026-03-25T15:45:45.483 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105273.log
2026-03-25T15:45:45.484 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105256.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105256.log.gz
2026-03-25T15:45:45.484 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105300.log
2026-03-25T15:45:45.484 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105273.log.gz
2026-03-25T15:45:45.484 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105327.log
2026-03-25T15:45:45.485 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105300.log.gz
2026-03-25T15:45:45.485 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105354.log
2026-03-25T15:45:45.485 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105327.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105327.log.gz
2026-03-25T15:45:45.486 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105381.log
2026-03-25T15:45:45.486 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105354.log.gz
2026-03-25T15:45:45.486 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105398.log
2026-03-25T15:45:45.487 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105381.log.gz
2026-03-25T15:45:45.487 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105420.log
2026-03-25T15:45:45.487 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105398.log.gz
2026-03-25T15:45:45.487 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105438.log
2026-03-25T15:45:45.488 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105420.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105420.log.gz
2026-03-25T15:45:45.488 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105459.log
2026-03-25T15:45:45.488 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105438.log: 54.4% -- replaced with /var/log/ceph/ceph-client.admin.105438.log.gz
2026-03-25T15:45:45.489 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105480.log
2026-03-25T15:45:45.489 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105459.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.105459.log.gz
2026-03-25T15:45:45.489 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105497.log
2026-03-25T15:45:45.490 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105480.log: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.105480.log.gz
2026-03-25T15:45:45.490 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105514.log
2026-03-25T15:45:45.490 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105497.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105497.log.gz
2026-03-25T15:45:45.490 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105534.log
2026-03-25T15:45:45.491 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105514.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105514.log.gz
2026-03-25T15:45:45.491 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105552.log
2026-03-25T15:45:45.491 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105534.log.gz
2026-03-25T15:45:45.492 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105573.log
2026-03-25T15:45:45.492 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105552.log.gz
2026-03-25T15:45:45.492 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105595.log
2026-03-25T15:45:45.492 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105573.log: 29.7% -- replaced with /var/log/ceph/ceph-client.admin.105573.log.gz
2026-03-25T15:45:45.493 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105616.log
2026-03-25T15:45:45.493 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105595.log.gz
2026-03-25T15:45:45.493 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105638.log
2026-03-25T15:45:45.494 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105616.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105616.log.gz
2026-03-25T15:45:45.494 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105659.log
2026-03-25T15:45:45.494 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105638.log.gz
2026-03-25T15:45:45.494 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105681.log
2026-03-25T15:45:45.495 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105659.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105659.log.gz
2026-03-25T15:45:45.495 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105702.log
2026-03-25T15:45:45.495 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105681.log.gz
2026-03-25T15:45:45.496 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105724.log
2026-03-25T15:45:45.496 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105702.log.gz
2026-03-25T15:45:45.496 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105745.log
2026-03-25T15:45:45.497 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105724.log.gz
2026-03-25T15:45:45.497 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105767.log
2026-03-25T15:45:45.497 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105745.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105745.log.gz
2026-03-25T15:45:45.497 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105788.log
2026-03-25T15:45:45.498 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105767.log.gz
2026-03-25T15:45:45.498 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105810.log
2026-03-25T15:45:45.498 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105788.log.gz
2026-03-25T15:45:45.499 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105831.log
2026-03-25T15:45:45.499 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105810.log.gz
2026-03-25T15:45:45.499 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105853.log
2026-03-25T15:45:45.500 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105831.log.gz
2026-03-25T15:45:45.500 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105874.log
2026-03-25T15:45:45.500 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105853.log.gz
2026-03-25T15:45:45.500 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105896.log
2026-03-25T15:45:45.501 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105874.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105874.log.gz
2026-03-25T15:45:45.501 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105917.log
2026-03-25T15:45:45.502 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105896.log.gz
2026-03-25T15:45:45.502 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105939.log
2026-03-25T15:45:45.502 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105917.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105917.log.gz
2026-03-25T15:45:45.502 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105960.log
2026-03-25T15:45:45.503 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105939.log.gz
2026-03-25T15:45:45.503 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.105982.log
2026-03-25T15:45:45.503 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105960.log.gz
2026-03-25T15:45:45.503 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106003.log
2026-03-25T15:45:45.504 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.105982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.105982.log.gz
2026-03-25T15:45:45.504 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106025.log
2026-03-25T15:45:45.504 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106003.log.gz
2026-03-25T15:45:45.505 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106046.log
2026-03-25T15:45:45.505 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106025.log.gz
2026-03-25T15:45:45.505 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106068.log
2026-03-25T15:45:45.505 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106046.log.gz
2026-03-25T15:45:45.506 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106089.log
2026-03-25T15:45:45.506 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106068.log.gz
2026-03-25T15:45:45.506 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106111.log
2026-03-25T15:45:45.507 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106089.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106089.log.gz
2026-03-25T15:45:45.507 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106132.log
2026-03-25T15:45:45.507 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106111.log.gz
2026-03-25T15:45:45.507 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106154.log
2026-03-25T15:45:45.508 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106132.log.gz
2026-03-25T15:45:45.508 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106175.log
2026-03-25T15:45:45.508 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106154.log.gz
2026-03-25T15:45:45.509 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106197.log
2026-03-25T15:45:45.509 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106175.log.gz
2026-03-25T15:45:45.509 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106218.log
2026-03-25T15:45:45.510 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106197.log.gz
2026-03-25T15:45:45.510 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106240.log
2026-03-25T15:45:45.510 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106218.log.gz
2026-03-25T15:45:45.510 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106261.log
2026-03-25T15:45:45.511 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106240.log.gz
2026-03-25T15:45:45.511 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106283.log
2026-03-25T15:45:45.511 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106261.log.gz
2026-03-25T15:45:45.512 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106304.log
2026-03-25T15:45:45.512 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106283.log.gz
2026-03-25T15:45:45.512 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106325.log
2026-03-25T15:45:45.512 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106304.log.gz
2026-03-25T15:45:45.513 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106351.log
2026-03-25T15:45:45.513 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106325.log.gz
2026-03-25T15:45:45.513 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106368.log
2026-03-25T15:45:45.514 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106351.log.gz
2026-03-25T15:45:45.514 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106385.log
2026-03-25T15:45:45.514 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106368.log.gz
2026-03-25T15:45:45.514 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106409.log
2026-03-25T15:45:45.515 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106385.log.gz
2026-03-25T15:45:45.515 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106434.log
2026-03-25T15:45:45.515 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106409.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106409.log.gz
2026-03-25T15:45:45.516 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106451.log
2026-03-25T15:45:45.516 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106434.log.gz
2026-03-25T15:45:45.516 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106468.log
2026-03-25T15:45:45.517 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106451.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106451.log.gz
2026-03-25T15:45:45.517 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106485.log
2026-03-25T15:45:45.517 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106468.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106468.log.gz
2026-03-25T15:45:45.517 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106502.log
2026-03-25T15:45:45.518 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106485.log.gz
2026-03-25T15:45:45.518 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106519.log
2026-03-25T15:45:45.518 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106502.log.gz
2026-03-25T15:45:45.519 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106536.log
2026-03-25T15:45:45.519 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106519.log.gz
2026-03-25T15:45:45.519 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106553.log
2026-03-25T15:45:45.519 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106536.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106536.log.gz
2026-03-25T15:45:45.520 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106571.log
2026-03-25T15:45:45.520 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106553.log.gz
2026-03-25T15:45:45.520 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106588.log
2026-03-25T15:45:45.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106571.log.gz
2026-03-25T15:45:45.521 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106606.log
2026-03-25T15:45:45.521 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106588.log.gz
2026-03-25T15:45:45.521 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106624.log
2026-03-25T15:45:45.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106606.log.gz
2026-03-25T15:45:45.522 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106641.log
2026-03-25T15:45:45.522 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106624.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106624.log.gz
2026-03-25T15:45:45.523 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106658.log
2026-03-25T15:45:45.523 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106641.log.gz
2026-03-25T15:45:45.523 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106675.log
2026-03-25T15:45:45.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106658.log.gz
2026-03-25T15:45:45.524 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106692.log
2026-03-25T15:45:45.524 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106675.log.gz
2026-03-25T15:45:45.524 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106710.log
2026-03-25T15:45:45.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106692.log.gz
2026-03-25T15:45:45.525 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106727.log
2026-03-25T15:45:45.525 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106710.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106710.log.gz
2026-03-25T15:45:45.525 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106746.log
2026-03-25T15:45:45.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106727.log.gz
2026-03-25T15:45:45.526 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106766.log
2026-03-25T15:45:45.526 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106746.log.gz
2026-03-25T15:45:45.527 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106786.log
2026-03-25T15:45:45.527 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106766.log.gz
2026-03-25T15:45:45.527 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106804.log
2026-03-25T15:45:45.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106786.log.gz
2026-03-25T15:45:45.528 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106822.log
2026-03-25T15:45:45.528 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106804.log.gz
2026-03-25T15:45:45.528 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106841.log
2026-03-25T15:45:45.529 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106822.log.gz
2026-03-25T15:45:45.529 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106859.log
2026-03-25T15:45:45.529 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106841.log.gz
2026-03-25T15:45:45.530 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106876.log
2026-03-25T15:45:45.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106859.log.gz
2026-03-25T15:45:45.530 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106894.log
2026-03-25T15:45:45.530 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106876.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106876.log.gz
2026-03-25T15:45:45.531 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106912.log
2026-03-25T15:45:45.531 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106894.log.gz
2026-03-25T15:45:45.531 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106929.log
2026-03-25T15:45:45.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106912.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106912.log.gz
2026-03-25T15:45:45.532 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106947.log
2026-03-25T15:45:45.532 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106929.log.gz
2026-03-25T15:45:45.532 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106966.log
2026-03-25T15:45:45.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106947.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106947.log.gz
2026-03-25T15:45:45.533 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.106985.log
2026-03-25T15:45:45.533 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106966.log.gz
2026-03-25T15:45:45.534 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107004.log
2026-03-25T15:45:45.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.106985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.106985.log.gz
2026-03-25T15:45:45.534 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107023.log
2026-03-25T15:45:45.534 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107004.log.gz
2026-03-25T15:45:45.535 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107043.log
2026-03-25T15:45:45.535 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107023.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107023.log.gz
2026-03-25T15:45:45.535 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107063.log
2026-03-25T15:45:45.536 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107043.log.gz
2026-03-25T15:45:45.536 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107083.log
2026-03-25T15:45:45.536 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107063.log.gz
2026-03-25T15:45:45.536 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107103.log
2026-03-25T15:45:45.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107083.log.gz
2026-03-25T15:45:45.537 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107123.log
2026-03-25T15:45:45.537 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107103.log.gz
2026-03-25T15:45:45.538 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107143.log
2026-03-25T15:45:45.538 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107123.log.gz
2026-03-25T15:45:45.538 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107162.log
2026-03-25T15:45:45.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107143.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107143.log.gz
2026-03-25T15:45:45.539 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107179.log
2026-03-25T15:45:45.539 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107162.log.gz
2026-03-25T15:45:45.539 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107201.log
2026-03-25T15:45:45.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107179.log.gz
2026-03-25T15:45:45.540 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107220.log
2026-03-25T15:45:45.540 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107201.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107201.log.gz
2026-03-25T15:45:45.541 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107240.log
2026-03-25T15:45:45.541 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107220.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107220.log.gz
2026-03-25T15:45:45.541 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107260.log
2026-03-25T15:45:45.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107240.log.gz
2026-03-25T15:45:45.542 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107278.log
2026-03-25T15:45:45.542 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107260.log.gz
2026-03-25T15:45:45.542 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107295.log
2026-03-25T15:45:45.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107278.log.gz
2026-03-25T15:45:45.543 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107313.log
2026-03-25T15:45:45.543 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107295.log.gz
2026-03-25T15:45:45.544 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107331.log
2026-03-25T15:45:45.544 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107313.log.gz
2026-03-25T15:45:45.544 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107350.log
2026-03-25T15:45:45.544 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107331.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107331.log.gz
2026-03-25T15:45:45.545 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107367.log
2026-03-25T15:45:45.545 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107350.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107350.log.gz
2026-03-25T15:45:45.545 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107385.log
2026-03-25T15:45:45.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107367.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107367.log.gz
2026-03-25T15:45:45.546 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107402.log
2026-03-25T15:45:45.546 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107385.log.gz
2026-03-25T15:45:45.546 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107419.log
2026-03-25T15:45:45.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107402.log: 0.0% -- replaced with
/var/log/ceph/ceph-client.admin.107402.log.gz 2026-03-25T15:45:45.547 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107436.log 2026-03-25T15:45:45.547 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107419.log.gz 2026-03-25T15:45:45.548 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107457.log 2026-03-25T15:45:45.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107436.log.gz 2026-03-25T15:45:45.548 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107476.log 2026-03-25T15:45:45.548 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107457.log.gz 2026-03-25T15:45:45.549 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107493.log 2026-03-25T15:45:45.549 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107476.log.gz 2026-03-25T15:45:45.549 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107511.log 2026-03-25T15:45:45.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107493.log.gz 2026-03-25T15:45:45.550 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107530.log 2026-03-25T15:45:45.550 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107511.log.gz 2026-03-25T15:45:45.550 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.107554.log 2026-03-25T15:45:45.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107530.log.gz 2026-03-25T15:45:45.551 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107573.log 2026-03-25T15:45:45.551 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107554.log.gz 2026-03-25T15:45:45.552 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107592.log 2026-03-25T15:45:45.552 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107573.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107573.log.gz 2026-03-25T15:45:45.552 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107610.log 2026-03-25T15:45:45.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107592.log.gz 2026-03-25T15:45:45.553 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107628.log 2026-03-25T15:45:45.553 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107610.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107610.log.gz 2026-03-25T15:45:45.553 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107646.log 2026-03-25T15:45:45.554 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107628.log.gz 2026-03-25T15:45:45.554 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107664.log 2026-03-25T15:45:45.554 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107646.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107646.log.gz 2026-03-25T15:45:45.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107682.log 2026-03-25T15:45:45.555 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107664.log.gz 2026-03-25T15:45:45.555 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107699.log 2026-03-25T15:45:45.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107682.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107682.log.gz 2026-03-25T15:45:45.556 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107716.log 2026-03-25T15:45:45.556 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107699.log.gz 2026-03-25T15:45:45.556 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107737.log 2026-03-25T15:45:45.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107716.log.gz 2026-03-25T15:45:45.557 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107756.log 2026-03-25T15:45:45.557 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107737.log.gz 2026-03-25T15:45:45.557 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107773.log 2026-03-25T15:45:45.558 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107756.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.107756.log.gz 2026-03-25T15:45:45.558 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107791.log 2026-03-25T15:45:45.558 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107773.log.gz 2026-03-25T15:45:45.559 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107810.log 2026-03-25T15:45:45.559 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107791.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107791.log.gz 2026-03-25T15:45:45.559 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107831.log 2026-03-25T15:45:45.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107810.log.gz 2026-03-25T15:45:45.560 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107851.log 2026-03-25T15:45:45.560 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107831.log.gz 2026-03-25T15:45:45.560 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107871.log 2026-03-25T15:45:45.561 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107851.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107851.log.gz 2026-03-25T15:45:45.561 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107891.log 2026-03-25T15:45:45.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107871.log.gz 2026-03-25T15:45:45.562 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.107911.log 2026-03-25T15:45:45.562 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107891.log.gz 2026-03-25T15:45:45.562 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107931.log 2026-03-25T15:45:45.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107911.log.gz 2026-03-25T15:45:45.563 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107950.log 2026-03-25T15:45:45.563 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107931.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107931.log.gz 2026-03-25T15:45:45.564 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107969.log 2026-03-25T15:45:45.564 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107950.log.gz 2026-03-25T15:45:45.564 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.107987.log 2026-03-25T15:45:45.564 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107969.log.gz 2026-03-25T15:45:45.565 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108005.log 2026-03-25T15:45:45.565 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.107987.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.107987.log.gz 2026-03-25T15:45:45.565 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108023.log 2026-03-25T15:45:45.566 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108005.log.gz 2026-03-25T15:45:45.566 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108041.log 2026-03-25T15:45:45.566 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108023.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108023.log.gz 2026-03-25T15:45:45.566 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108059.log 2026-03-25T15:45:45.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108041.log.gz 2026-03-25T15:45:45.567 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108076.log 2026-03-25T15:45:45.567 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108059.log.gz 2026-03-25T15:45:45.568 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108093.log 2026-03-25T15:45:45.568 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108076.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108076.log.gz 2026-03-25T15:45:45.568 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108110.log 2026-03-25T15:45:45.569 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108093.log.gz 2026-03-25T15:45:45.569 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108127.log 2026-03-25T15:45:45.569 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108110.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.108110.log.gz 2026-03-25T15:45:45.569 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108144.log 2026-03-25T15:45:45.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108127.log.gz 2026-03-25T15:45:45.570 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108161.log 2026-03-25T15:45:45.570 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108144.log.gz 2026-03-25T15:45:45.571 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108178.log 2026-03-25T15:45:45.571 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108161.log.gz 2026-03-25T15:45:45.571 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108195.log 2026-03-25T15:45:45.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108178.log.gz 2026-03-25T15:45:45.572 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108212.log 2026-03-25T15:45:45.572 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108195.log.gz 2026-03-25T15:45:45.572 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108229.log 2026-03-25T15:45:45.573 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108212.log.gz 2026-03-25T15:45:45.573 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.108247.log 2026-03-25T15:45:45.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108229.log.gz 2026-03-25T15:45:45.574 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108265.log 2026-03-25T15:45:45.574 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108247.log.gz 2026-03-25T15:45:45.574 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108282.log 2026-03-25T15:45:45.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108265.log.gz 2026-03-25T15:45:45.575 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108299.log 2026-03-25T15:45:45.575 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108282.log.gz 2026-03-25T15:45:45.575 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108317.log 2026-03-25T15:45:45.576 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108299.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108299.log.gz 2026-03-25T15:45:45.576 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108338.log 2026-03-25T15:45:45.576 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108317.log.gz 2026-03-25T15:45:45.577 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108360.log 2026-03-25T15:45:45.577 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108338.log.gz 2026-03-25T15:45:45.577 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108381.log 2026-03-25T15:45:45.577 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108360.log.gz 2026-03-25T15:45:45.578 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108403.log 2026-03-25T15:45:45.578 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108381.log.gz 2026-03-25T15:45:45.578 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108424.log 2026-03-25T15:45:45.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108403.log.gz 2026-03-25T15:45:45.579 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108446.log 2026-03-25T15:45:45.579 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108424.log.gz 2026-03-25T15:45:45.579 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108467.log 2026-03-25T15:45:45.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108446.log.gz 2026-03-25T15:45:45.580 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108489.log 2026-03-25T15:45:45.580 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108467.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.108467.log.gz 2026-03-25T15:45:45.581 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108510.log 2026-03-25T15:45:45.581 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108489.log.gz 2026-03-25T15:45:45.581 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108532.log 2026-03-25T15:45:45.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108510.log.gz 2026-03-25T15:45:45.582 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108553.log 2026-03-25T15:45:45.582 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108532.log.gz 2026-03-25T15:45:45.583 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108575.log 2026-03-25T15:45:45.583 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108553.log.gz 2026-03-25T15:45:45.583 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108596.log 2026-03-25T15:45:45.584 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108575.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108575.log.gz 2026-03-25T15:45:45.584 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108618.log 2026-03-25T15:45:45.584 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108596.log.gz 2026-03-25T15:45:45.584 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.108639.log 2026-03-25T15:45:45.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108618.log.gz 2026-03-25T15:45:45.585 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108661.log 2026-03-25T15:45:45.585 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108639.log.gz 2026-03-25T15:45:45.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108682.log 2026-03-25T15:45:45.586 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108661.log.gz 2026-03-25T15:45:45.586 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108704.log 2026-03-25T15:45:45.586 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108682.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108682.log.gz 2026-03-25T15:45:45.587 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108725.log 2026-03-25T15:45:45.587 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108704.log.gz 2026-03-25T15:45:45.587 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108747.log 2026-03-25T15:45:45.588 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108725.log.gz 2026-03-25T15:45:45.588 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108768.log 2026-03-25T15:45:45.588 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108747.log.gz 2026-03-25T15:45:45.588 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108790.log 2026-03-25T15:45:45.589 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108768.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108768.log.gz 2026-03-25T15:45:45.589 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108811.log 2026-03-25T15:45:45.589 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108790.log.gz 2026-03-25T15:45:45.590 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108833.log 2026-03-25T15:45:45.590 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108811.log.gz 2026-03-25T15:45:45.590 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108854.log 2026-03-25T15:45:45.591 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108833.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108833.log.gz 2026-03-25T15:45:45.591 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108876.log 2026-03-25T15:45:45.591 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108854.log.gz 2026-03-25T15:45:45.591 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108897.log 2026-03-25T15:45:45.592 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108876.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.108876.log.gz 2026-03-25T15:45:45.592 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108919.log 2026-03-25T15:45:45.592 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108897.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108897.log.gz 2026-03-25T15:45:45.593 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108940.log 2026-03-25T15:45:45.593 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108919.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108919.log.gz 2026-03-25T15:45:45.593 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108962.log 2026-03-25T15:45:45.593 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108940.log.gz 2026-03-25T15:45:45.594 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.108983.log 2026-03-25T15:45:45.594 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108962.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108962.log.gz 2026-03-25T15:45:45.594 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109005.log 2026-03-25T15:45:45.595 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.108983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.108983.log.gz 2026-03-25T15:45:45.595 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109026.log 2026-03-25T15:45:45.595 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109005.log.gz 2026-03-25T15:45:45.595 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.109048.log 2026-03-25T15:45:45.596 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109026.log.gz 2026-03-25T15:45:45.596 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109069.log 2026-03-25T15:45:45.596 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109048.log.gz 2026-03-25T15:45:45.597 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109090.log 2026-03-25T15:45:45.597 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109069.log.gz 2026-03-25T15:45:45.597 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109115.log 2026-03-25T15:45:45.598 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109090.log.gz 2026-03-25T15:45:45.598 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109136.log 2026-03-25T15:45:45.598 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109115.log.gz 2026-03-25T15:45:45.599 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109158.log 2026-03-25T15:45:45.599 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109136.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109136.log.gz 2026-03-25T15:45:45.599 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109179.log 2026-03-25T15:45:45.599 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109158.log.gz 2026-03-25T15:45:45.600 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109201.log 2026-03-25T15:45:45.600 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109179.log.gz 2026-03-25T15:45:45.600 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109222.log 2026-03-25T15:45:45.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109201.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109201.log.gz 2026-03-25T15:45:45.601 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109244.log 2026-03-25T15:45:45.601 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109222.log.gz 2026-03-25T15:45:45.602 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109265.log 2026-03-25T15:45:45.602 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109244.log.gz 2026-03-25T15:45:45.602 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109287.log 2026-03-25T15:45:45.603 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109265.log.gz 2026-03-25T15:45:45.603 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109308.log 2026-03-25T15:45:45.603 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109287.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.109287.log.gz 2026-03-25T15:45:45.603 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109330.log 2026-03-25T15:45:45.604 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109308.log.gz 2026-03-25T15:45:45.604 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109351.log 2026-03-25T15:45:45.604 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109330.log.gz 2026-03-25T15:45:45.605 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109373.log 2026-03-25T15:45:45.605 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109351.log.gz 2026-03-25T15:45:45.605 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109394.log 2026-03-25T15:45:45.605 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109373.log.gz 2026-03-25T15:45:45.606 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109416.log 2026-03-25T15:45:45.606 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109394.log.gz 2026-03-25T15:45:45.606 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109437.log 2026-03-25T15:45:45.607 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109416.log.gz 2026-03-25T15:45:45.607 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.109459.log 2026-03-25T15:45:45.607 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109437.log.gz 2026-03-25T15:45:45.607 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109480.log 2026-03-25T15:45:45.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109459.log.gz 2026-03-25T15:45:45.608 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109502.log 2026-03-25T15:45:45.608 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109480.log.gz 2026-03-25T15:45:45.609 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109523.log 2026-03-25T15:45:45.609 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109502.log.gz 2026-03-25T15:45:45.609 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109545.log 2026-03-25T15:45:45.610 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109523.log.gz 2026-03-25T15:45:45.610 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109566.log 2026-03-25T15:45:45.610 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109545.log.gz 2026-03-25T15:45:45.610 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109588.log 2026-03-25T15:45:45.611 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109566.log.gz 2026-03-25T15:45:45.611 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109609.log 2026-03-25T15:45:45.611 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109588.log.gz 2026-03-25T15:45:45.612 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109631.log 2026-03-25T15:45:45.612 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109609.log.gz 2026-03-25T15:45:45.612 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109652.log 2026-03-25T15:45:45.612 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109631.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109631.log.gz 2026-03-25T15:45:45.613 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109674.log 2026-03-25T15:45:45.613 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109652.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109652.log.gz 2026-03-25T15:45:45.613 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109695.log 2026-03-25T15:45:45.614 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109674.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109674.log.gz 2026-03-25T15:45:45.614 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109717.log 2026-03-25T15:45:45.614 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109695.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.109695.log.gz 2026-03-25T15:45:45.614 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109738.log 2026-03-25T15:45:45.615 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109717.log.gz 2026-03-25T15:45:45.615 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109760.log 2026-03-25T15:45:45.615 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109738.log: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.109738.log.gz 2026-03-25T15:45:45.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109781.log 2026-03-25T15:45:45.616 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109760.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109760.log.gz 2026-03-25T15:45:45.616 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109803.log 2026-03-25T15:45:45.617 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109781.log.gz 2026-03-25T15:45:45.617 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109824.log 2026-03-25T15:45:45.617 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109803.log.gz 2026-03-25T15:45:45.617 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109846.log 2026-03-25T15:45:45.618 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109824.log.gz 2026-03-25T15:45:45.618 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.109867.log 2026-03-25T15:45:45.618 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109846.log.gz 2026-03-25T15:45:45.619 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109888.log 2026-03-25T15:45:45.619 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109867.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109867.log.gz 2026-03-25T15:45:45.619 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109914.log 2026-03-25T15:45:45.620 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109888.log.gz 2026-03-25T15:45:45.620 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109931.log 2026-03-25T15:45:45.620 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109914.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109914.log.gz 2026-03-25T15:45:45.620 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109948.log 2026-03-25T15:45:45.621 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109931.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109931.log.gz 2026-03-25T15:45:45.621 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109965.log 2026-03-25T15:45:45.622 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109948.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109948.log.gz 2026-03-25T15:45:45.622 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.109984.log 2026-03-25T15:45:45.622 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109965.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109965.log.gz 2026-03-25T15:45:45.623 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110011.log 2026-03-25T15:45:45.623 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.109984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.109984.log.gz 2026-03-25T15:45:45.623 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110035.log 2026-03-25T15:45:45.623 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110011.log.gz 2026-03-25T15:45:45.624 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110055.log 2026-03-25T15:45:45.624 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110035.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110035.log.gz 2026-03-25T15:45:45.624 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110073.log 2026-03-25T15:45:45.625 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110055.log.gz 2026-03-25T15:45:45.625 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110091.log 2026-03-25T15:45:45.625 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110073.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110073.log.gz 2026-03-25T15:45:45.626 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110123.log 2026-03-25T15:45:45.626 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110091.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.110091.log.gz 2026-03-25T15:45:45.626 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110140.log 2026-03-25T15:45:45.627 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110123.log.gz 2026-03-25T15:45:45.627 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110158.log 2026-03-25T15:45:45.627 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110140.log.gz 2026-03-25T15:45:45.627 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110176.log 2026-03-25T15:45:45.628 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110158.log.gz 2026-03-25T15:45:45.628 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110193.log 2026-03-25T15:45:45.629 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110176.log.gz 2026-03-25T15:45:45.629 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110210.log 2026-03-25T15:45:45.629 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110193.log.gz 2026-03-25T15:45:45.629 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110229.log 2026-03-25T15:45:45.630 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110210.log.gz 2026-03-25T15:45:45.630 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.110248.log 2026-03-25T15:45:45.630 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110229.log.gz 2026-03-25T15:45:45.631 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110273.log 2026-03-25T15:45:45.631 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110248.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110248.log.gz 2026-03-25T15:45:45.631 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110294.log 2026-03-25T15:45:45.632 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110273.log.gz 2026-03-25T15:45:45.632 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110316.log 2026-03-25T15:45:45.632 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110294.log.gz 2026-03-25T15:45:45.632 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110337.log 2026-03-25T15:45:45.633 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110316.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110316.log.gz 2026-03-25T15:45:45.633 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110359.log 2026-03-25T15:45:45.633 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110337.log.gz 2026-03-25T15:45:45.634 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110380.log 2026-03-25T15:45:45.634 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110359.log.gz 2026-03-25T15:45:45.634 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110402.log 2026-03-25T15:45:45.635 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110380.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110380.log.gz 2026-03-25T15:45:45.635 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110423.log 2026-03-25T15:45:45.635 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110402.log.gz 2026-03-25T15:45:45.635 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110445.log 2026-03-25T15:45:45.636 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110423.log.gz 2026-03-25T15:45:45.636 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110466.log 2026-03-25T15:45:45.636 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110445.log.gz 2026-03-25T15:45:45.637 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110488.log 2026-03-25T15:45:45.637 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110466.log.gz 2026-03-25T15:45:45.637 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110509.log 2026-03-25T15:45:45.638 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110488.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.110488.log.gz 2026-03-25T15:45:45.638 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110531.log 2026-03-25T15:45:45.638 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110509.log.gz 2026-03-25T15:45:45.638 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110552.log 2026-03-25T15:45:45.639 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110531.log.gz 2026-03-25T15:45:45.639 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110574.log 2026-03-25T15:45:45.639 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110552.log.gz 2026-03-25T15:45:45.640 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110595.log 2026-03-25T15:45:45.640 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110574.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110574.log.gz 2026-03-25T15:45:45.640 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110617.log 2026-03-25T15:45:45.641 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110595.log.gz 2026-03-25T15:45:45.641 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110638.log 2026-03-25T15:45:45.641 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110617.log.gz 2026-03-25T15:45:45.641 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.110660.log 2026-03-25T15:45:45.642 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110638.log.gz 2026-03-25T15:45:45.642 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110681.log 2026-03-25T15:45:45.642 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110660.log.gz 2026-03-25T15:45:45.643 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110703.log 2026-03-25T15:45:45.643 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110681.log.gz 2026-03-25T15:45:45.643 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110724.log 2026-03-25T15:45:45.644 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110703.log.gz 2026-03-25T15:45:45.644 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110746.log 2026-03-25T15:45:45.644 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110724.log.gz 2026-03-25T15:45:45.644 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110767.log 2026-03-25T15:45:45.645 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110746.log.gz 2026-03-25T15:45:45.645 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110789.log 2026-03-25T15:45:45.645 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110767.log.gz 2026-03-25T15:45:45.646 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110810.log 2026-03-25T15:45:45.646 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110789.log.gz 2026-03-25T15:45:45.646 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110832.log 2026-03-25T15:45:45.647 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110810.log.gz 2026-03-25T15:45:45.647 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110853.log 2026-03-25T15:45:45.647 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110832.log.gz 2026-03-25T15:45:45.648 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110875.log 2026-03-25T15:45:45.648 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110853.log.gz 2026-03-25T15:45:45.648 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110896.log 2026-03-25T15:45:45.648 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110875.log.gz 2026-03-25T15:45:45.649 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110918.log 2026-03-25T15:45:45.649 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110896.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.110896.log.gz 2026-03-25T15:45:45.649 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110939.log 2026-03-25T15:45:45.650 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110918.log.gz 2026-03-25T15:45:45.650 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110961.log 2026-03-25T15:45:45.650 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110939.log.gz 2026-03-25T15:45:45.650 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.110982.log 2026-03-25T15:45:45.651 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110961.log.gz 2026-03-25T15:45:45.651 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111004.log 2026-03-25T15:45:45.651 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.110982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.110982.log.gz 2026-03-25T15:45:45.652 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111025.log 2026-03-25T15:45:45.652 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111004.log.gz 2026-03-25T15:45:45.652 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111046.log 2026-03-25T15:45:45.653 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111025.log.gz 2026-03-25T15:45:45.653 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.111072.log 2026-03-25T15:45:45.653 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111046.log.gz 2026-03-25T15:45:45.654 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111089.log 2026-03-25T15:45:45.654 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111072.log.gz 2026-03-25T15:45:45.654 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111106.log 2026-03-25T15:45:45.655 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111089.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111089.log.gz 2026-03-25T15:45:45.655 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111123.log 2026-03-25T15:45:45.655 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111106.log.gz 2026-03-25T15:45:45.655 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111140.log 2026-03-25T15:45:45.656 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111123.log.gz 2026-03-25T15:45:45.656 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111157.log 2026-03-25T15:45:45.656 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111140.log.gz 2026-03-25T15:45:45.657 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111181.log 2026-03-25T15:45:45.657 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111157.log.gz 2026-03-25T15:45:45.657 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111206.log 2026-03-25T15:45:45.657 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111181.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111181.log.gz 2026-03-25T15:45:45.658 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111223.log 2026-03-25T15:45:45.658 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111206.log.gz 2026-03-25T15:45:45.658 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111240.log 2026-03-25T15:45:45.659 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111223.log.gz 2026-03-25T15:45:45.659 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111258.log 2026-03-25T15:45:45.659 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111240.log.gz 2026-03-25T15:45:45.660 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111279.log 2026-03-25T15:45:45.660 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111258.log.gz 2026-03-25T15:45:45.660 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111300.log 2026-03-25T15:45:45.661 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111279.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.111279.log.gz 2026-03-25T15:45:45.661 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111321.log 2026-03-25T15:45:45.661 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111300.log.gz 2026-03-25T15:45:45.662 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111338.log 2026-03-25T15:45:45.662 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111321.log.gz 2026-03-25T15:45:45.662 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111355.log 2026-03-25T15:45:45.663 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111338.log.gz 2026-03-25T15:45:45.663 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111372.log 2026-03-25T15:45:45.663 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111355.log.gz 2026-03-25T15:45:45.664 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111389.log 2026-03-25T15:45:45.664 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111372.log.gz 2026-03-25T15:45:45.664 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111406.log 2026-03-25T15:45:45.665 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111389.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111389.log.gz 2026-03-25T15:45:45.665 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.111423.log 2026-03-25T15:45:45.665 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111406.log.gz 2026-03-25T15:45:45.665 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111441.log 2026-03-25T15:45:45.666 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111423.log.gz 2026-03-25T15:45:45.666 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111458.log 2026-03-25T15:45:45.666 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111441.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111441.log.gz 2026-03-25T15:45:45.667 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111476.log 2026-03-25T15:45:45.667 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111458.log.gz 2026-03-25T15:45:45.667 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111493.log 2026-03-25T15:45:45.667 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111476.log.gz 2026-03-25T15:45:45.668 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111511.log 2026-03-25T15:45:45.668 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111493.log.gz 2026-03-25T15:45:45.668 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111530.log 2026-03-25T15:45:45.669 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111511.log.gz 2026-03-25T15:45:45.669 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111553.log 2026-03-25T15:45:45.669 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111530.log.gz 2026-03-25T15:45:45.669 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111576.log 2026-03-25T15:45:45.670 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111553.log.gz 2026-03-25T15:45:45.670 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111599.log 2026-03-25T15:45:45.670 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111576.log.gz 2026-03-25T15:45:45.671 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111622.log 2026-03-25T15:45:45.671 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111599.log.gz 2026-03-25T15:45:45.671 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111645.log 2026-03-25T15:45:45.671 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111622.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111622.log.gz 2026-03-25T15:45:45.672 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111669.log 2026-03-25T15:45:45.672 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111645.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.111645.log.gz 2026-03-25T15:45:45.672 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111691.log 2026-03-25T15:45:45.673 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111669.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111669.log.gz 2026-03-25T15:45:45.673 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111712.log 2026-03-25T15:45:45.673 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111691.log.gz 2026-03-25T15:45:45.673 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111729.log 2026-03-25T15:45:45.674 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111712.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111712.log.gz 2026-03-25T15:45:45.674 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111747.log 2026-03-25T15:45:45.674 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111729.log.gz 2026-03-25T15:45:45.675 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111764.log 2026-03-25T15:45:45.675 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111747.log.gz 2026-03-25T15:45:45.675 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111782.log 2026-03-25T15:45:45.676 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111764.log.gz 2026-03-25T15:45:45.676 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.111799.log 2026-03-25T15:45:45.676 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111782.log.gz 2026-03-25T15:45:45.676 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111817.log 2026-03-25T15:45:45.677 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111799.log.gz 2026-03-25T15:45:45.677 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111834.log 2026-03-25T15:45:45.677 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111817.log.gz 2026-03-25T15:45:45.678 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111852.log 2026-03-25T15:45:45.678 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111834.log.gz 2026-03-25T15:45:45.678 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111871.log 2026-03-25T15:45:45.679 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111852.log.gz 2026-03-25T15:45:45.679 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111890.log 2026-03-25T15:45:45.679 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111871.log.gz 2026-03-25T15:45:45.679 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111909.log 2026-03-25T15:45:45.680 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111890.log.gz 2026-03-25T15:45:45.680 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111927.log 2026-03-25T15:45:45.681 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111909.log.gz 2026-03-25T15:45:45.681 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111948.log 2026-03-25T15:45:45.681 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111927.log.gz 2026-03-25T15:45:45.682 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111967.log 2026-03-25T15:45:45.682 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111948.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111948.log.gz 2026-03-25T15:45:45.682 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.111986.log 2026-03-25T15:45:45.683 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111967.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111967.log.gz 2026-03-25T15:45:45.683 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112005.log 2026-03-25T15:45:45.683 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.111986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.111986.log.gz 2026-03-25T15:45:45.683 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112024.log 2026-03-25T15:45:45.684 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112005.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.112005.log.gz 2026-03-25T15:45:45.684 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112043.log 2026-03-25T15:45:45.684 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112024.log.gz 2026-03-25T15:45:45.685 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112061.log 2026-03-25T15:45:45.685 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112043.log.gz 2026-03-25T15:45:45.685 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112080.log 2026-03-25T15:45:45.686 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112061.log.gz 2026-03-25T15:45:45.686 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112101.log 2026-03-25T15:45:45.686 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112080.log.gz 2026-03-25T15:45:45.686 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112120.log 2026-03-25T15:45:45.687 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112101.log.gz 2026-03-25T15:45:45.687 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112139.log 2026-03-25T15:45:45.687 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112120.log.gz 2026-03-25T15:45:45.688 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.112158.log 2026-03-25T15:45:45.688 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112139.log.gz 2026-03-25T15:45:45.688 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112177.log 2026-03-25T15:45:45.688 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112158.log.gz 2026-03-25T15:45:45.689 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112211.log 2026-03-25T15:45:45.689 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112177.log.gz 2026-03-25T15:45:45.689 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112230.log 2026-03-25T15:45:45.690 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112211.log.gz 2026-03-25T15:45:45.690 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112248.log 2026-03-25T15:45:45.690 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112230.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112230.log.gz 2026-03-25T15:45:45.690 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112266.log 2026-03-25T15:45:45.691 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112248.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112248.log.gz 2026-03-25T15:45:45.691 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112283.log 2026-03-25T15:45:45.691 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112266.log.gz 2026-03-25T15:45:45.691 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112300.log 2026-03-25T15:45:45.692 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112283.log.gz 2026-03-25T15:45:45.692 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112318.log 2026-03-25T15:45:45.693 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112300.log: 29.6% -- replaced with /var/log/ceph/ceph-client.admin.112300.log.gz 2026-03-25T15:45:45.693 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112336.log 2026-03-25T15:45:45.693 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112318.log.gz 2026-03-25T15:45:45.693 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112353.log 2026-03-25T15:45:45.694 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112336.log.gz 2026-03-25T15:45:45.694 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112371.log 2026-03-25T15:45:45.694 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112353.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112353.log.gz 2026-03-25T15:45:45.694 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112389.log 2026-03-25T15:45:45.695 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112371.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.112371.log.gz 2026-03-25T15:45:45.695 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112406.log 2026-03-25T15:45:45.695 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112389.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112389.log.gz 2026-03-25T15:45:45.696 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112424.log 2026-03-25T15:45:45.696 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112406.log.gz 2026-03-25T15:45:45.696 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112442.log 2026-03-25T15:45:45.697 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112424.log.gz 2026-03-25T15:45:45.697 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112459.log 2026-03-25T15:45:45.697 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112442.log.gz 2026-03-25T15:45:45.697 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112476.log 2026-03-25T15:45:45.698 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112459.log.gz 2026-03-25T15:45:45.698 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112493.log 2026-03-25T15:45:45.698 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112476.log.gz 2026-03-25T15:45:45.699 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.112510.log 2026-03-25T15:45:45.699 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112493.log.gz 2026-03-25T15:45:45.699 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112527.log 2026-03-25T15:45:45.699 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112510.log.gz 2026-03-25T15:45:45.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112544.log 2026-03-25T15:45:45.700 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112527.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112527.log.gz 2026-03-25T15:45:45.700 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112561.log 2026-03-25T15:45:45.701 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112544.log.gz 2026-03-25T15:45:45.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112578.log 2026-03-25T15:45:45.701 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112561.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112561.log.gz 2026-03-25T15:45:45.701 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112595.log 2026-03-25T15:45:45.702 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112578.log.gz 2026-03-25T15:45:45.702 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112612.log 2026-03-25T15:45:45.702 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112595.log.gz 2026-03-25T15:45:45.703 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112629.log 2026-03-25T15:45:45.703 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112612.log.gz 2026-03-25T15:45:45.703 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112651.log 2026-03-25T15:45:45.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112629.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.112629.log.gz 2026-03-25T15:45:45.704 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112670.log 2026-03-25T15:45:45.704 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112651.log: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.112651.log.gz 2026-03-25T15:45:45.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112689.log 2026-03-25T15:45:45.705 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112670.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112670.log.gz 2026-03-25T15:45:45.705 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112708.log 2026-03-25T15:45:45.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112689.log.gz 2026-03-25T15:45:45.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112727.log 2026-03-25T15:45:45.706 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112708.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.112708.log.gz 2026-03-25T15:45:45.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112747.log 2026-03-25T15:45:45.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112727.log.gz 2026-03-25T15:45:45.707 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112765.log 2026-03-25T15:45:45.707 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112747.log.gz 2026-03-25T15:45:45.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112784.log 2026-03-25T15:45:45.708 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112765.log.gz 2026-03-25T15:45:45.708 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112801.log 2026-03-25T15:45:45.708 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112784.log.gz 2026-03-25T15:45:45.709 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112819.log 2026-03-25T15:45:45.709 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112801.log.gz 2026-03-25T15:45:45.709 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112840.log 2026-03-25T15:45:45.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112819.log.gz 2026-03-25T15:45:45.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.112862.log 2026-03-25T15:45:45.710 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112840.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112840.log.gz 2026-03-25T15:45:45.710 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112883.log 2026-03-25T15:45:45.711 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112862.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112862.log.gz 2026-03-25T15:45:45.711 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112905.log 2026-03-25T15:45:45.711 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112883.log.gz 2026-03-25T15:45:45.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112926.log 2026-03-25T15:45:45.712 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112905.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112905.log.gz 2026-03-25T15:45:45.712 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112948.log 2026-03-25T15:45:45.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112926.log.gz 2026-03-25T15:45:45.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112969.log 2026-03-25T15:45:45.713 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112948.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112948.log.gz 2026-03-25T15:45:45.713 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.112991.log 2026-03-25T15:45:45.714 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112969.log.gz 2026-03-25T15:45:45.714 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113012.log 2026-03-25T15:45:45.714 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.112991.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.112991.log.gz 2026-03-25T15:45:45.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113034.log 2026-03-25T15:45:45.715 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113012.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113012.log.gz 2026-03-25T15:45:45.715 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113055.log 2026-03-25T15:45:45.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113034.log.gz 2026-03-25T15:45:45.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113077.log 2026-03-25T15:45:45.716 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113055.log.gz 2026-03-25T15:45:45.716 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113098.log 2026-03-25T15:45:45.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113077.log.gz 2026-03-25T15:45:45.717 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113120.log 2026-03-25T15:45:45.717 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113098.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.113098.log.gz 2026-03-25T15:45:45.718 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113141.log 2026-03-25T15:45:45.718 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113120.log.gz 2026-03-25T15:45:45.718 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113163.log 2026-03-25T15:45:45.718 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113141.log.gz 2026-03-25T15:45:45.719 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113184.log 2026-03-25T15:45:45.719 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113163.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113163.log.gz 2026-03-25T15:45:45.719 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113206.log 2026-03-25T15:45:45.720 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113184.log.gz 2026-03-25T15:45:45.720 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113227.log 2026-03-25T15:45:45.720 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113206.log.gz 2026-03-25T15:45:45.721 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113249.log 2026-03-25T15:45:45.721 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113227.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113227.log.gz 2026-03-25T15:45:45.721 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.113270.log 2026-03-25T15:45:45.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113249.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113249.log.gz 2026-03-25T15:45:45.722 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113292.log 2026-03-25T15:45:45.722 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113270.log.gz 2026-03-25T15:45:45.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113313.log 2026-03-25T15:45:45.723 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113292.log.gz 2026-03-25T15:45:45.723 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113335.log 2026-03-25T15:45:45.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113313.log.gz 2026-03-25T15:45:45.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113356.log 2026-03-25T15:45:45.724 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113335.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113335.log.gz 2026-03-25T15:45:45.724 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113378.log 2026-03-25T15:45:45.725 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113356.log.gz 2026-03-25T15:45:45.725 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113399.log 2026-03-25T15:45:45.725 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113378.log.gz 2026-03-25T15:45:45.726 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113421.log 2026-03-25T15:45:45.726 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113399.log.gz 2026-03-25T15:45:45.726 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113442.log 2026-03-25T15:45:45.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113421.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113421.log.gz 2026-03-25T15:45:45.727 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113464.log 2026-03-25T15:45:45.727 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113442.log.gz 2026-03-25T15:45:45.727 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113485.log 2026-03-25T15:45:45.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113464.log.gz 2026-03-25T15:45:45.728 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113507.log 2026-03-25T15:45:45.728 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113485.log.gz 2026-03-25T15:45:45.729 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113528.log 2026-03-25T15:45:45.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113507.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.113507.log.gz 2026-03-25T15:45:45.729 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113550.log 2026-03-25T15:45:45.729 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113528.log.gz 2026-03-25T15:45:45.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113571.log 2026-03-25T15:45:45.730 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113550.log.gz 2026-03-25T15:45:45.730 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113592.log 2026-03-25T15:45:45.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113571.log.gz 2026-03-25T15:45:45.731 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113617.log 2026-03-25T15:45:45.731 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113592.log.gz 2026-03-25T15:45:45.731 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113638.log 2026-03-25T15:45:45.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113617.log.gz 2026-03-25T15:45:45.732 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113660.log 2026-03-25T15:45:45.732 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113638.log.gz 2026-03-25T15:45:45.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.113681.log 2026-03-25T15:45:45.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113660.log.gz 2026-03-25T15:45:45.733 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113703.log 2026-03-25T15:45:45.733 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113681.log.gz 2026-03-25T15:45:45.734 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113724.log 2026-03-25T15:45:45.734 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113703.log.gz 2026-03-25T15:45:45.734 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113746.log 2026-03-25T15:45:45.735 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113724.log.gz 2026-03-25T15:45:45.735 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113767.log 2026-03-25T15:45:45.735 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113746.log.gz 2026-03-25T15:45:45.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113789.log 2026-03-25T15:45:45.736 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113767.log.gz 2026-03-25T15:45:45.736 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113810.log 2026-03-25T15:45:45.736 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113789.log.gz 2026-03-25T15:45:45.737 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113832.log 2026-03-25T15:45:45.737 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113810.log.gz 2026-03-25T15:45:45.737 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113853.log 2026-03-25T15:45:45.738 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113832.log.gz 2026-03-25T15:45:45.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113875.log 2026-03-25T15:45:45.738 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113853.log.gz 2026-03-25T15:45:45.738 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113896.log 2026-03-25T15:45:45.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113875.log.gz 2026-03-25T15:45:45.739 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113918.log 2026-03-25T15:45:45.739 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113896.log.gz 2026-03-25T15:45:45.740 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113939.log 2026-03-25T15:45:45.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113918.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.113918.log.gz 2026-03-25T15:45:45.740 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113961.log 2026-03-25T15:45:45.740 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113939.log.gz 2026-03-25T15:45:45.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.113982.log 2026-03-25T15:45:45.741 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113961.log.gz 2026-03-25T15:45:45.741 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114004.log 2026-03-25T15:45:45.742 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.113982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.113982.log.gz 2026-03-25T15:45:45.742 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114025.log 2026-03-25T15:45:45.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114004.log.gz 2026-03-25T15:45:45.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114047.log 2026-03-25T15:45:45.743 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114025.log.gz 2026-03-25T15:45:45.743 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114068.log 2026-03-25T15:45:45.744 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114047.log.gz 2026-03-25T15:45:45.744 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.114090.log 2026-03-25T15:45:45.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114068.log.gz 2026-03-25T15:45:45.745 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114111.log 2026-03-25T15:45:45.745 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114090.log.gz 2026-03-25T15:45:45.745 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114133.log 2026-03-25T15:45:45.746 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114111.log.gz 2026-03-25T15:45:45.746 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114154.log 2026-03-25T15:45:45.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114133.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114133.log.gz 2026-03-25T15:45:45.747 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114176.log 2026-03-25T15:45:45.747 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114154.log.gz 2026-03-25T15:45:45.747 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114197.log 2026-03-25T15:45:45.748 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114176.log.gz 2026-03-25T15:45:45.748 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114219.log 2026-03-25T15:45:45.748 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114197.log.gz 2026-03-25T15:45:45.749 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114240.log 2026-03-25T15:45:45.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114219.log.gz 2026-03-25T15:45:45.749 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114262.log 2026-03-25T15:45:45.749 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114240.log.gz 2026-03-25T15:45:45.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114283.log 2026-03-25T15:45:45.750 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114262.log.gz 2026-03-25T15:45:45.750 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114305.log 2026-03-25T15:45:45.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114283.log.gz 2026-03-25T15:45:45.751 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114326.log 2026-03-25T15:45:45.751 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114305.log.gz 2026-03-25T15:45:45.751 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114348.log 2026-03-25T15:45:45.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114326.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.114326.log.gz 2026-03-25T15:45:45.752 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114369.log 2026-03-25T15:45:45.752 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114348.log.gz 2026-03-25T15:45:45.753 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114390.log 2026-03-25T15:45:45.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114369.log.gz 2026-03-25T15:45:45.753 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114416.log 2026-03-25T15:45:45.753 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114390.log.gz 2026-03-25T15:45:45.754 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114433.log 2026-03-25T15:45:45.754 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114416.log.gz 2026-03-25T15:45:45.754 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114450.log 2026-03-25T15:45:45.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114433.log.gz 2026-03-25T15:45:45.755 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114467.log 2026-03-25T15:45:45.755 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114450.log.gz 2026-03-25T15:45:45.755 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.114484.log 2026-03-25T15:45:45.756 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114467.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114467.log.gz 2026-03-25T15:45:45.756 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114501.log 2026-03-25T15:45:45.756 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114484.log.gz 2026-03-25T15:45:45.757 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114518.log 2026-03-25T15:45:45.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114501.log: 29.0% -- replaced with /var/log/ceph/ceph-client.admin.114501.log.gz 2026-03-25T15:45:45.757 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114539.log 2026-03-25T15:45:45.757 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114518.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114518.log.gz 2026-03-25T15:45:45.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114560.log 2026-03-25T15:45:45.758 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114539.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114539.log.gz 2026-03-25T15:45:45.758 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114577.log 2026-03-25T15:45:45.759 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114560.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114560.log.gz 2026-03-25T15:45:45.759 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114595.log 2026-03-25T15:45:45.759 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114577.log.gz 2026-03-25T15:45:45.759 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114622.log 2026-03-25T15:45:45.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114595.log.gz 2026-03-25T15:45:45.760 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114646.log 2026-03-25T15:45:45.760 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114622.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114622.log.gz 2026-03-25T15:45:45.760 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114665.log 2026-03-25T15:45:45.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114646.log: 30.5% -- replaced with /var/log/ceph/ceph-client.admin.114646.log.gz 2026-03-25T15:45:45.761 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114683.log 2026-03-25T15:45:45.761 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114665.log.gz 2026-03-25T15:45:45.762 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114701.log 2026-03-25T15:45:45.762 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114683.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114683.log.gz 2026-03-25T15:45:45.762 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114719.log 2026-03-25T15:45:45.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114701.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.114701.log.gz 2026-03-25T15:45:45.763 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114737.log 2026-03-25T15:45:45.763 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114719.log.gz 2026-03-25T15:45:45.763 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114769.log 2026-03-25T15:45:45.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114737.log.gz 2026-03-25T15:45:45.764 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114786.log 2026-03-25T15:45:45.764 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114769.log.gz 2026-03-25T15:45:45.765 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114804.log 2026-03-25T15:45:45.765 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114786.log.gz 2026-03-25T15:45:45.765 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114822.log 2026-03-25T15:45:45.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114804.log.gz 2026-03-25T15:45:45.766 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114839.log 2026-03-25T15:45:45.766 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114822.log.gz 2026-03-25T15:45:45.766 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.114856.log 2026-03-25T15:45:45.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114839.log.gz 2026-03-25T15:45:45.767 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114875.log 2026-03-25T15:45:45.767 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114856.log.gz 2026-03-25T15:45:45.768 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114894.log 2026-03-25T15:45:45.768 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114875.log.gz 2026-03-25T15:45:45.768 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114914.log 2026-03-25T15:45:45.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114894.log.gz 2026-03-25T15:45:45.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114935.log 2026-03-25T15:45:45.769 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114914.log: 53.9% -- replaced with /var/log/ceph/ceph-client.admin.114914.log.gz 2026-03-25T15:45:45.769 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114960.log 2026-03-25T15:45:45.770 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114935.log.gz 2026-03-25T15:45:45.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.114981.log 2026-03-25T15:45:45.770 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114960.log.gz 2026-03-25T15:45:45.770 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115003.log 2026-03-25T15:45:45.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.114981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.114981.log.gz 2026-03-25T15:45:45.771 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115024.log 2026-03-25T15:45:45.771 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115003.log.gz 2026-03-25T15:45:45.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115046.log 2026-03-25T15:45:45.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115024.log.gz 2026-03-25T15:45:45.772 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115067.log 2026-03-25T15:45:45.772 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115046.log.gz 2026-03-25T15:45:45.773 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115089.log 2026-03-25T15:45:45.773 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115067.log.gz 2026-03-25T15:45:45.773 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115110.log 2026-03-25T15:45:45.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115089.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.115089.log.gz 2026-03-25T15:45:45.774 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115132.log 2026-03-25T15:45:45.774 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115110.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115110.log.gz 2026-03-25T15:45:45.774 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115153.log 2026-03-25T15:45:45.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115132.log.gz 2026-03-25T15:45:45.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115175.log 2026-03-25T15:45:45.775 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115153.log.gz 2026-03-25T15:45:45.775 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115196.log 2026-03-25T15:45:45.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115175.log.gz 2026-03-25T15:45:45.776 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115218.log 2026-03-25T15:45:45.776 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115196.log.gz 2026-03-25T15:45:45.776 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115239.log 2026-03-25T15:45:45.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115218.log.gz 2026-03-25T15:45:45.777 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.115261.log 2026-03-25T15:45:45.777 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115239.log.gz 2026-03-25T15:45:45.778 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115282.log 2026-03-25T15:45:45.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115261.log.gz 2026-03-25T15:45:45.778 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115304.log 2026-03-25T15:45:45.778 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115282.log.gz 2026-03-25T15:45:45.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115325.log 2026-03-25T15:45:45.779 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115304.log.gz 2026-03-25T15:45:45.779 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115347.log 2026-03-25T15:45:45.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115325.log.gz 2026-03-25T15:45:45.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115368.log 2026-03-25T15:45:45.780 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115347.log.gz 2026-03-25T15:45:45.780 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115390.log 2026-03-25T15:45:45.781 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115368.log.gz 2026-03-25T15:45:45.781 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115411.log 2026-03-25T15:45:45.781 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115390.log.gz 2026-03-25T15:45:45.781 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115433.log 2026-03-25T15:45:45.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115411.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115411.log.gz 2026-03-25T15:45:45.782 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115454.log 2026-03-25T15:45:45.782 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115433.log.gz 2026-03-25T15:45:45.782 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115476.log 2026-03-25T15:45:45.783 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115454.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115454.log.gz 2026-03-25T15:45:45.783 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115497.log 2026-03-25T15:45:45.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115476.log.gz 2026-03-25T15:45:45.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115519.log 2026-03-25T15:45:45.784 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115497.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.115497.log.gz 2026-03-25T15:45:45.784 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115540.log 2026-03-25T15:45:45.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115519.log.gz 2026-03-25T15:45:45.785 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115562.log 2026-03-25T15:45:45.785 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115540.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115540.log.gz 2026-03-25T15:45:45.785 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115583.log 2026-03-25T15:45:45.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115562.log.gz 2026-03-25T15:45:45.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115605.log 2026-03-25T15:45:45.786 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115583.log.gz 2026-03-25T15:45:45.786 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115626.log 2026-03-25T15:45:45.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115605.log.gz 2026-03-25T15:45:45.787 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115648.log 2026-03-25T15:45:45.787 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115626.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115626.log.gz 2026-03-25T15:45:45.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.115669.log
2026-03-25T15:45:45.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115648.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115648.log.gz
2026-03-25T15:45:45.788 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115691.log
2026-03-25T15:45:45.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115669.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115669.log.gz
2026-03-25T15:45:45.789 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115712.log
2026-03-25T15:45:45.789 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115691.log.gz
2026-03-25T15:45:45.789 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115733.log
2026-03-25T15:45:45.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115712.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115712.log.gz
2026-03-25T15:45:45.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115759.log
2026-03-25T15:45:45.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115733.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115733.log.gz
2026-03-25T15:45:45.791 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115776.log
2026-03-25T15:45:45.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115759.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115759.log.gz
2026-03-25T15:45:45.791 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115793.log
2026-03-25T15:45:45.791 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115776.log.gz
2026-03-25T15:45:45.792 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115819.log
2026-03-25T15:45:45.792 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115793.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115793.log.gz
2026-03-25T15:45:45.792 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115837.log
2026-03-25T15:45:45.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115819.log.gz
2026-03-25T15:45:45.793 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115854.log
2026-03-25T15:45:45.793 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115837.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115837.log.gz
2026-03-25T15:45:45.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115871.log
2026-03-25T15:45:45.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115854.log.gz
2026-03-25T15:45:45.794 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115888.log
2026-03-25T15:45:45.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115871.log.gz
2026-03-25T15:45:45.795 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115905.log
2026-03-25T15:45:45.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115888.log.gz
2026-03-25T15:45:45.795 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115922.log
2026-03-25T15:45:45.795 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115905.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115905.log.gz
2026-03-25T15:45:45.796 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115939.log
2026-03-25T15:45:45.796 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115922.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115922.log.gz
2026-03-25T15:45:45.796 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115941.log
2026-03-25T15:45:45.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115939.log: 86.1% -- replaced with /var/log/ceph/ceph-client.admin.115939.log.gz
2026-03-25T15:45:45.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115940.log
2026-03-25T15:45:45.797 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115941.log: 87.3% -- replaced with /var/log/ceph/ceph-client.admin.115941.log.gz
2026-03-25T15:45:45.797 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115943.log
2026-03-25T15:45:45.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115940.log: 86.4% -- replaced with /var/log/ceph/ceph-client.admin.115940.log.gz
2026-03-25T15:45:45.798 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115945.log
2026-03-25T15:45:45.798 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115943.log: 86.6% -- replaced with /var/log/ceph/ceph-client.admin.115943.log.gz
2026-03-25T15:45:45.799 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.115942.log
2026-03-25T15:45:45.799 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.115945.log.gz
2026-03-25T15:45:45.799 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116064.log
2026-03-25T15:45:45.800 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.115942.log: 86.9% -- replaced with /var/log/ceph/ceph-client.admin.115942.log.gz
2026-03-25T15:45:45.800 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116083.log
2026-03-25T15:45:45.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116064.log.gz
2026-03-25T15:45:45.801 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116102.log
2026-03-25T15:45:45.801 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116083.log.gz
2026-03-25T15:45:45.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116121.log
2026-03-25T15:45:45.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116102.log.gz
2026-03-25T15:45:45.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116140.log
2026-03-25T15:45:45.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116121.log.gz
2026-03-25T15:45:45.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116159.log
2026-03-25T15:45:45.803 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116140.log.gz
2026-03-25T15:45:45.803 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116179.log
2026-03-25T15:45:45.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116159.log.gz
2026-03-25T15:45:45.804 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116200.log
2026-03-25T15:45:45.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116179.log.gz
2026-03-25T15:45:45.805 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116222.log
2026-03-25T15:45:45.805 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116200.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116200.log.gz
2026-03-25T15:45:45.806 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116243.log
2026-03-25T15:45:45.806 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116222.log.gz
2026-03-25T15:45:45.806 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116265.log
2026-03-25T15:45:45.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116243.log.gz
2026-03-25T15:45:45.807 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116286.log
2026-03-25T15:45:45.807 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116265.log.gz
2026-03-25T15:45:45.807 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116308.log
2026-03-25T15:45:45.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116286.log.gz
2026-03-25T15:45:45.808 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116329.log
2026-03-25T15:45:45.808 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116308.log.gz
2026-03-25T15:45:45.809 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116351.log
2026-03-25T15:45:45.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116329.log.gz
2026-03-25T15:45:45.809 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116372.log
2026-03-25T15:45:45.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116351.log.gz
2026-03-25T15:45:45.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116394.log
2026-03-25T15:45:45.810 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116372.log.gz
2026-03-25T15:45:45.810 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116415.log
2026-03-25T15:45:45.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116394.log.gz
2026-03-25T15:45:45.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116437.log
2026-03-25T15:45:45.811 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116415.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116415.log.gz
2026-03-25T15:45:45.811 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116458.log
2026-03-25T15:45:45.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116437.log.gz
2026-03-25T15:45:45.812 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116480.log
2026-03-25T15:45:45.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116458.log.gz
2026-03-25T15:45:45.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116501.log
2026-03-25T15:45:45.813 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116480.log.gz
2026-03-25T15:45:45.813 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116523.log
2026-03-25T15:45:45.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116501.log.gz
2026-03-25T15:45:45.814 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116544.log
2026-03-25T15:45:45.814 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116523.log.gz
2026-03-25T15:45:45.814 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116566.log
2026-03-25T15:45:45.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116544.log.gz
2026-03-25T15:45:45.815 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116587.log
2026-03-25T15:45:45.815 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116566.log.gz
2026-03-25T15:45:45.816 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116609.log
2026-03-25T15:45:45.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116587.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116587.log.gz
2026-03-25T15:45:45.816 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116630.log
2026-03-25T15:45:45.816 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116609.log.gz
2026-03-25T15:45:45.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116652.log
2026-03-25T15:45:45.817 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116630.log.gz
2026-03-25T15:45:45.817 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116673.log
2026-03-25T15:45:45.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116652.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116652.log.gz
2026-03-25T15:45:45.818 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116695.log
2026-03-25T15:45:45.818 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116673.log.gz
2026-03-25T15:45:45.818 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116716.log
2026-03-25T15:45:45.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116695.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116695.log.gz
2026-03-25T15:45:45.819 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116738.log
2026-03-25T15:45:45.819 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116716.log.gz
2026-03-25T15:45:45.820 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116759.log
2026-03-25T15:45:45.820 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116738.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116738.log.gz
2026-03-25T15:45:45.820 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116781.log
2026-03-25T15:45:45.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116759.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.116759.log.gz
2026-03-25T15:45:45.821 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116802.log
2026-03-25T15:45:45.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116781.log.gz
2026-03-25T15:45:45.821 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116824.log
2026-03-25T15:45:45.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116802.log.gz
2026-03-25T15:45:45.822 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116845.log
2026-03-25T15:45:45.822 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116824.log.gz
2026-03-25T15:45:45.823 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116867.log
2026-03-25T15:45:45.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116845.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116845.log.gz
2026-03-25T15:45:45.823 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116888.log
2026-03-25T15:45:45.823 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116867.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116867.log.gz
2026-03-25T15:45:45.824 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116910.log
2026-03-25T15:45:45.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116888.log.gz
2026-03-25T15:45:45.825 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116931.log
2026-03-25T15:45:45.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116910.log.gz
2026-03-25T15:45:45.826 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116952.log
2026-03-25T15:45:45.826 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116931.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116931.log.gz
2026-03-25T15:45:45.826 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.116976.log
2026-03-25T15:45:45.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116952.log.gz
2026-03-25T15:45:45.827 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117001.log
2026-03-25T15:45:45.827 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.116976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.116976.log.gz
2026-03-25T15:45:45.828 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117022.log
2026-03-25T15:45:45.828 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117001.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117001.log.gz
2026-03-25T15:45:45.828 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117044.log
2026-03-25T15:45:45.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117022.log.gz
2026-03-25T15:45:45.829 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117065.log
2026-03-25T15:45:45.829 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117044.log.gz
2026-03-25T15:45:45.829 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117087.log
2026-03-25T15:45:45.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117065.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117065.log.gz
2026-03-25T15:45:45.830 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117108.log
2026-03-25T15:45:45.830 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117087.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117087.log.gz
2026-03-25T15:45:45.831 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117130.log
2026-03-25T15:45:45.831 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117108.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117108.log.gz
2026-03-25T15:45:45.831 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117151.log
2026-03-25T15:45:45.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117130.log.gz
2026-03-25T15:45:45.832 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117173.log
2026-03-25T15:45:45.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117151.log.gz
2026-03-25T15:45:45.833 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117194.log
2026-03-25T15:45:45.833 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117173.log.gz
2026-03-25T15:45:45.833 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117216.log
2026-03-25T15:45:45.834 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117194.log.gz
2026-03-25T15:45:45.834 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117237.log
2026-03-25T15:45:45.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117216.log.gz
2026-03-25T15:45:45.835 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117259.log
2026-03-25T15:45:45.835 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117237.log.gz
2026-03-25T15:45:45.836 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117280.log
2026-03-25T15:45:45.836 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117259.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117259.log.gz
2026-03-25T15:45:45.836 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117302.log
2026-03-25T15:45:45.837 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117280.log.gz
2026-03-25T15:45:45.837 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117323.log
2026-03-25T15:45:45.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117302.log.gz
2026-03-25T15:45:45.838 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117345.log
2026-03-25T15:45:45.838 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117323.log.gz
2026-03-25T15:45:45.839 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117366.log
2026-03-25T15:45:45.839 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117345.log.gz
2026-03-25T15:45:45.839 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117388.log
2026-03-25T15:45:45.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117366.log.gz
2026-03-25T15:45:45.840 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117409.log
2026-03-25T15:45:45.840 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117388.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117388.log.gz
2026-03-25T15:45:45.840 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117431.log
2026-03-25T15:45:45.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117409.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117409.log.gz
2026-03-25T15:45:45.841 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117452.log
2026-03-25T15:45:45.841 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117431.log.gz
2026-03-25T15:45:45.842 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117474.log
2026-03-25T15:45:45.842 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117452.log.gz
2026-03-25T15:45:45.842 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117495.log
2026-03-25T15:45:45.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117474.log.gz
2026-03-25T15:45:45.843 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117517.log
2026-03-25T15:45:45.843 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117495.log.gz
2026-03-25T15:45:45.844 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117538.log
2026-03-25T15:45:45.844 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117517.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117517.log.gz
2026-03-25T15:45:45.845 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117560.log
2026-03-25T15:45:45.845 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117538.log.gz
2026-03-25T15:45:45.845 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117581.log
2026-03-25T15:45:45.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117560.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117560.log.gz
2026-03-25T15:45:45.846 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117603.log
2026-03-25T15:45:45.846 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117581.log.gz
2026-03-25T15:45:45.847 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117624.log
2026-03-25T15:45:45.847 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117603.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117603.log.gz
2026-03-25T15:45:45.847 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117646.log
2026-03-25T15:45:45.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117624.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117624.log.gz
2026-03-25T15:45:45.848 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117667.log
2026-03-25T15:45:45.848 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117646.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117646.log.gz
2026-03-25T15:45:45.849 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117689.log
2026-03-25T15:45:45.849 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117667.log.gz
2026-03-25T15:45:45.849 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117710.log
2026-03-25T15:45:45.850 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117689.log.gz
2026-03-25T15:45:45.850 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117732.log
2026-03-25T15:45:45.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117710.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117710.log.gz
2026-03-25T15:45:45.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117753.log
2026-03-25T15:45:45.851 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117732.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117732.log.gz
2026-03-25T15:45:45.851 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117774.log
2026-03-25T15:45:45.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117753.log.gz
2026-03-25T15:45:45.852 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117800.log
2026-03-25T15:45:45.852 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117774.log.gz
2026-03-25T15:45:45.853 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117817.log
2026-03-25T15:45:45.853 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117800.log.gz
2026-03-25T15:45:45.853 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117834.log
2026-03-25T15:45:45.854 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117817.log.gz
2026-03-25T15:45:45.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117851.log
2026-03-25T15:45:45.854 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117834.log.gz
2026-03-25T15:45:45.854 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117868.log
2026-03-25T15:45:45.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117851.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117851.log.gz
2026-03-25T15:45:45.855 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117869.log
2026-03-25T15:45:45.855 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117868.log: 79.1% -- replaced with /var/log/ceph/ceph-client.admin.117868.log.gz
2026-03-25T15:45:45.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117871.log
2026-03-25T15:45:45.856 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117869.log: 84.6% -- replaced with /var/log/ceph/ceph-client.admin.117869.log.gz
2026-03-25T15:45:45.856 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117930.log
2026-03-25T15:45:45.857 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117871.log.gz
2026-03-25T15:45:45.857 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117957.log
2026-03-25T15:45:45.857 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117930.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117930.log.gz
2026-03-25T15:45:45.858 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.117981.log
2026-03-25T15:45:45.858 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117957.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117957.log.gz
2026-03-25T15:45:45.858 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118015.log
2026-03-25T15:45:45.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.117981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.117981.log.gz
2026-03-25T15:45:45.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118034.log
2026-03-25T15:45:45.859 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118015.log.gz
2026-03-25T15:45:45.859 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118055.log
2026-03-25T15:45:45.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118034.log.gz
2026-03-25T15:45:45.860 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118077.log
2026-03-25T15:45:45.860 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118055.log.gz
2026-03-25T15:45:45.861 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118098.log
2026-03-25T15:45:45.861 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118077.log.gz
2026-03-25T15:45:45.861 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118120.log
2026-03-25T15:45:45.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118098.log.gz
2026-03-25T15:45:45.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118141.log
2026-03-25T15:45:45.862 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118120.log.gz
2026-03-25T15:45:45.862 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118163.log
2026-03-25T15:45:45.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118141.log.gz
2026-03-25T15:45:45.863 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118184.log
2026-03-25T15:45:45.863 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118163.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118163.log.gz
2026-03-25T15:45:45.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118206.log
2026-03-25T15:45:45.864 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118184.log.gz
2026-03-25T15:45:45.864 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118227.log
2026-03-25T15:45:45.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118206.log.gz
2026-03-25T15:45:45.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118249.log
2026-03-25T15:45:45.865 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118227.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118227.log.gz
2026-03-25T15:45:45.865 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118270.log
2026-03-25T15:45:45.866 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118249.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118249.log.gz
2026-03-25T15:45:45.866 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118292.log
2026-03-25T15:45:45.866 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118270.log.gz
2026-03-25T15:45:45.867 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118313.log
2026-03-25T15:45:45.867 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118292.log.gz
2026-03-25T15:45:45.867 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118335.log
2026-03-25T15:45:45.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118313.log.gz
2026-03-25T15:45:45.868 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118356.log
2026-03-25T15:45:45.868 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118335.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118335.log.gz
2026-03-25T15:45:45.868 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118378.log
2026-03-25T15:45:45.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118356.log.gz
2026-03-25T15:45:45.869 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118399.log
2026-03-25T15:45:45.869 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118378.log.gz
2026-03-25T15:45:45.870 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118421.log
2026-03-25T15:45:45.870 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118399.log.gz
2026-03-25T15:45:45.870 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118442.log
2026-03-25T15:45:45.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118421.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118421.log.gz
2026-03-25T15:45:45.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118464.log
2026-03-25T15:45:45.871 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118442.log.gz
2026-03-25T15:45:45.871 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.118485.log 2026-03-25T15:45:45.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118464.log.gz 2026-03-25T15:45:45.872 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118507.log 2026-03-25T15:45:45.872 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118485.log.gz 2026-03-25T15:45:45.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118528.log 2026-03-25T15:45:45.873 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118507.log.gz 2026-03-25T15:45:45.873 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118550.log 2026-03-25T15:45:45.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118528.log.gz 2026-03-25T15:45:45.874 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118571.log 2026-03-25T15:45:45.874 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118550.log.gz 2026-03-25T15:45:45.874 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118593.log 2026-03-25T15:45:45.875 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118571.log.gz 2026-03-25T15:45:45.875 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118614.log 2026-03-25T15:45:45.875 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118593.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118593.log.gz 2026-03-25T15:45:45.876 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118636.log 2026-03-25T15:45:45.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118614.log.gz 2026-03-25T15:45:45.876 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118657.log 2026-03-25T15:45:45.876 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118636.log.gz 2026-03-25T15:45:45.877 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118679.log 2026-03-25T15:45:45.877 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118657.log.gz 2026-03-25T15:45:45.877 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118700.log 2026-03-25T15:45:45.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118679.log.gz 2026-03-25T15:45:45.878 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118722.log 2026-03-25T15:45:45.878 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118700.log.gz 2026-03-25T15:45:45.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118743.log 2026-03-25T15:45:45.879 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118722.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.118722.log.gz 2026-03-25T15:45:45.879 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118765.log 2026-03-25T15:45:45.879 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118743.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118743.log.gz 2026-03-25T15:45:45.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118786.log 2026-03-25T15:45:45.880 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118765.log.gz 2026-03-25T15:45:45.880 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118807.log 2026-03-25T15:45:45.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118786.log.gz 2026-03-25T15:45:45.881 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118832.log 2026-03-25T15:45:45.881 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118807.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118807.log.gz 2026-03-25T15:45:45.881 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118853.log 2026-03-25T15:45:45.882 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118832.log.gz 2026-03-25T15:45:45.882 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118875.log 2026-03-25T15:45:45.883 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118853.log.gz 2026-03-25T15:45:45.883 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.118896.log 2026-03-25T15:45:45.884 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118875.log.gz 2026-03-25T15:45:45.884 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118918.log 2026-03-25T15:45:45.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118896.log.gz 2026-03-25T15:45:45.885 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118939.log 2026-03-25T15:45:45.885 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118918.log.gz 2026-03-25T15:45:45.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118961.log 2026-03-25T15:45:45.886 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118939.log.gz 2026-03-25T15:45:45.886 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.118982.log 2026-03-25T15:45:45.887 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118961.log.gz 2026-03-25T15:45:45.887 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119004.log 2026-03-25T15:45:45.888 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.118982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.118982.log.gz 2026-03-25T15:45:45.888 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119025.log 2026-03-25T15:45:45.888 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119004.log.gz 2026-03-25T15:45:45.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119047.log 2026-03-25T15:45:45.889 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119025.log.gz 2026-03-25T15:45:45.889 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119068.log 2026-03-25T15:45:45.890 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119047.log.gz 2026-03-25T15:45:45.890 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119090.log 2026-03-25T15:45:45.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119068.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119068.log.gz 2026-03-25T15:45:45.891 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119111.log 2026-03-25T15:45:45.891 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119090.log.gz 2026-03-25T15:45:45.892 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119133.log 2026-03-25T15:45:45.892 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119111.log.gz 2026-03-25T15:45:45.892 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119154.log 2026-03-25T15:45:45.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119133.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.119133.log.gz 2026-03-25T15:45:45.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119176.log 2026-03-25T15:45:45.893 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119154.log.gz 2026-03-25T15:45:45.893 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119197.log 2026-03-25T15:45:45.894 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119176.log.gz 2026-03-25T15:45:45.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119219.log 2026-03-25T15:45:45.894 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119197.log.gz 2026-03-25T15:45:45.894 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119240.log 2026-03-25T15:45:45.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119219.log.gz 2026-03-25T15:45:45.895 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119262.log 2026-03-25T15:45:45.895 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119240.log.gz 2026-03-25T15:45:45.896 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119283.log 2026-03-25T15:45:45.896 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119262.log.gz 2026-03-25T15:45:45.896 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.119305.log 2026-03-25T15:45:45.896 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119283.log.gz 2026-03-25T15:45:45.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119326.log 2026-03-25T15:45:45.897 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119305.log.gz 2026-03-25T15:45:45.897 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119348.log 2026-03-25T15:45:45.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119326.log.gz 2026-03-25T15:45:45.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119369.log 2026-03-25T15:45:45.898 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119348.log.gz 2026-03-25T15:45:45.898 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119391.log 2026-03-25T15:45:45.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119369.log.gz 2026-03-25T15:45:45.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119412.log 2026-03-25T15:45:45.899 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119391.log.gz 2026-03-25T15:45:45.899 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119434.log 2026-03-25T15:45:45.900 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119412.log.gz 2026-03-25T15:45:45.900 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119455.log 2026-03-25T15:45:45.900 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119434.log.gz 2026-03-25T15:45:45.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119477.log 2026-03-25T15:45:45.901 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119455.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119455.log.gz 2026-03-25T15:45:45.901 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119498.log 2026-03-25T15:45:45.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119477.log.gz 2026-03-25T15:45:45.902 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119520.log 2026-03-25T15:45:45.902 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119498.log.gz 2026-03-25T15:45:45.902 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119541.log 2026-03-25T15:45:45.903 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119520.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119520.log.gz 2026-03-25T15:45:45.903 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119563.log 2026-03-25T15:45:45.904 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119541.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.119541.log.gz 2026-03-25T15:45:45.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119584.log 2026-03-25T15:45:45.904 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119563.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119563.log.gz 2026-03-25T15:45:45.904 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119605.log 2026-03-25T15:45:45.905 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119584.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119584.log.gz 2026-03-25T15:45:45.905 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119631.log 2026-03-25T15:45:45.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119605.log.gz 2026-03-25T15:45:45.906 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119648.log 2026-03-25T15:45:45.906 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119631.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119631.log.gz 2026-03-25T15:45:45.906 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119665.log 2026-03-25T15:45:45.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119648.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119648.log.gz 2026-03-25T15:45:45.907 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119691.log 2026-03-25T15:45:45.907 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119665.log.gz 2026-03-25T15:45:45.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.119708.log 2026-03-25T15:45:45.908 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119691.log.gz 2026-03-25T15:45:45.908 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119726.log 2026-03-25T15:45:45.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119708.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119708.log.gz 2026-03-25T15:45:45.909 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119752.log 2026-03-25T15:45:45.909 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119726.log.gz 2026-03-25T15:45:45.910 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119778.log 2026-03-25T15:45:45.910 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119752.log.gz 2026-03-25T15:45:45.910 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119802.log 2026-03-25T15:45:45.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119778.log.gz 2026-03-25T15:45:45.911 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119829.log 2026-03-25T15:45:45.911 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119802.log.gz 2026-03-25T15:45:45.912 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119848.log 2026-03-25T15:45:45.912 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119829.log.gz 2026-03-25T15:45:45.912 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119867.log 2026-03-25T15:45:45.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119848.log.gz 2026-03-25T15:45:45.913 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119886.log 2026-03-25T15:45:45.913 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119867.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119867.log.gz 2026-03-25T15:45:45.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119904.log 2026-03-25T15:45:45.914 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119886.log.gz 2026-03-25T15:45:45.914 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119928.log 2026-03-25T15:45:45.915 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119904.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119904.log.gz 2026-03-25T15:45:45.915 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119953.log 2026-03-25T15:45:45.915 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119928.log.gz 2026-03-25T15:45:45.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119974.log 2026-03-25T15:45:45.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119953.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.119953.log.gz 2026-03-25T15:45:45.916 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.119996.log 2026-03-25T15:45:45.916 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119974.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119974.log.gz 2026-03-25T15:45:45.917 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120017.log 2026-03-25T15:45:45.917 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.119996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.119996.log.gz 2026-03-25T15:45:45.917 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120039.log 2026-03-25T15:45:45.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120017.log.gz 2026-03-25T15:45:45.918 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120060.log 2026-03-25T15:45:45.918 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120039.log.gz 2026-03-25T15:45:45.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120082.log 2026-03-25T15:45:45.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120060.log.gz 2026-03-25T15:45:45.919 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120103.log 2026-03-25T15:45:45.919 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120082.log.gz 2026-03-25T15:45:45.920 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.120125.log 2026-03-25T15:45:45.920 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120103.log.gz 2026-03-25T15:45:45.920 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120146.log 2026-03-25T15:45:45.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120125.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120125.log.gz 2026-03-25T15:45:45.921 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120168.log 2026-03-25T15:45:45.921 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120146.log.gz 2026-03-25T15:45:45.921 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120189.log 2026-03-25T15:45:45.922 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120168.log.gz 2026-03-25T15:45:45.922 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120211.log 2026-03-25T15:45:45.922 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120189.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120189.log.gz 2026-03-25T15:45:45.923 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120232.log 2026-03-25T15:45:45.923 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120211.log.gz 2026-03-25T15:45:45.923 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120254.log 2026-03-25T15:45:45.924 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120232.log.gz
2026-03-25T15:45:45.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120275.log
2026-03-25T15:45:45.924 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120254.log.gz
2026-03-25T15:45:45.924 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120297.log
2026-03-25T15:45:45.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120275.log.gz
2026-03-25T15:45:45.925 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120318.log
2026-03-25T15:45:45.925 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120297.log.gz
2026-03-25T15:45:45.926 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120340.log
2026-03-25T15:45:45.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120318.log.gz
2026-03-25T15:45:45.926 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120361.log
2026-03-25T15:45:45.926 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120340.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120340.log.gz
2026-03-25T15:45:45.927 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120383.log
2026-03-25T15:45:45.927 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120361.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120361.log.gz
2026-03-25T15:45:45.927 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120404.log
2026-03-25T15:45:45.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120383.log.gz
2026-03-25T15:45:45.928 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120426.log
2026-03-25T15:45:45.928 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120404.log.gz
2026-03-25T15:45:45.929 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120447.log
2026-03-25T15:45:45.929 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120426.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120426.log.gz
2026-03-25T15:45:45.929 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120469.log
2026-03-25T15:45:45.929 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120447.log.gz
2026-03-25T15:45:45.930 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120490.log
2026-03-25T15:45:45.930 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120469.log.gz
2026-03-25T15:45:45.930 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120512.log
2026-03-25T15:45:45.931 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120490.log.gz
2026-03-25T15:45:45.931 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120533.log
2026-03-25T15:45:45.931 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120512.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120512.log.gz
2026-03-25T15:45:45.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120555.log
2026-03-25T15:45:45.932 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120533.log.gz
2026-03-25T15:45:45.932 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120576.log
2026-03-25T15:45:45.933 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120555.log.gz
2026-03-25T15:45:45.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120598.log
2026-03-25T15:45:45.933 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120576.log.gz
2026-03-25T15:45:45.933 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120619.log
2026-03-25T15:45:45.934 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120598.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120598.log.gz
2026-03-25T15:45:45.934 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120641.log
2026-03-25T15:45:45.935 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120619.log.gz
2026-03-25T15:45:45.935 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120662.log
2026-03-25T15:45:45.935 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120641.log.gz
2026-03-25T15:45:45.936 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120684.log
2026-03-25T15:45:45.936 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120662.log.gz
2026-03-25T15:45:45.936 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120705.log
2026-03-25T15:45:45.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120684.log.gz
2026-03-25T15:45:45.937 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120726.log
2026-03-25T15:45:45.937 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120705.log.gz
2026-03-25T15:45:45.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120752.log
2026-03-25T15:45:45.938 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120726.log.gz
2026-03-25T15:45:45.938 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120769.log
2026-03-25T15:45:45.939 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120752.log.gz
2026-03-25T15:45:45.939 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120786.log
2026-03-25T15:45:45.939 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120769.log.gz
2026-03-25T15:45:45.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120806.log
2026-03-25T15:45:45.940 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120786.log.gz
2026-03-25T15:45:45.940 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120826.log
2026-03-25T15:45:45.941 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120806.log: 28.9% -- replaced with /var/log/ceph/ceph-client.admin.120806.log.gz
2026-03-25T15:45:45.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120846.log
2026-03-25T15:45:45.941 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120826.log.gz
2026-03-25T15:45:45.941 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120863.log
2026-03-25T15:45:45.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120846.log.gz
2026-03-25T15:45:45.942 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120884.log
2026-03-25T15:45:45.942 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120863.log: 65.7% -- replaced with /var/log/ceph/ceph-client.admin.120863.log.gz
2026-03-25T15:45:45.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120904.log
2026-03-25T15:45:45.943 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120884.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120884.log.gz
2026-03-25T15:45:45.943 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120924.log
2026-03-25T15:45:45.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120904.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120904.log.gz
2026-03-25T15:45:45.944 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120944.log
2026-03-25T15:45:45.944 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120924.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120924.log.gz
2026-03-25T15:45:45.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120964.log
2026-03-25T15:45:45.945 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120944.log.gz
2026-03-25T15:45:45.945 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.120984.log
2026-03-25T15:45:45.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120964.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120964.log.gz
2026-03-25T15:45:45.946 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121004.log
2026-03-25T15:45:45.946 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.120984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.120984.log.gz
2026-03-25T15:45:45.947 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121024.log
2026-03-25T15:45:45.947 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121004.log.gz
2026-03-25T15:45:45.947 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121048.log
2026-03-25T15:45:45.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121024.log.gz
2026-03-25T15:45:45.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121072.log
2026-03-25T15:45:45.948 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121048.log.gz
2026-03-25T15:45:45.948 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121096.log
2026-03-25T15:45:45.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121072.log.gz
2026-03-25T15:45:45.949 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121120.log
2026-03-25T15:45:45.949 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121096.log.gz
2026-03-25T15:45:45.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121144.log
2026-03-25T15:45:45.950 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121120.log.gz
2026-03-25T15:45:45.950 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121168.log
2026-03-25T15:45:45.951 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121144.log.gz
2026-03-25T15:45:45.951 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121192.log
2026-03-25T15:45:45.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121168.log.gz
2026-03-25T15:45:45.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121216.log
2026-03-25T15:45:45.952 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121192.log.gz
2026-03-25T15:45:45.952 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121237.log
2026-03-25T15:45:45.953 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121216.log.gz
2026-03-25T15:45:45.953 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121257.log
2026-03-25T15:45:45.953 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121237.log: 59.1% -- replaced with /var/log/ceph/ceph-client.admin.121237.log.gz
2026-03-25T15:45:45.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121281.log
2026-03-25T15:45:45.954 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121257.log.gz
2026-03-25T15:45:45.954 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121302.log
2026-03-25T15:45:45.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121281.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121281.log.gz
2026-03-25T15:45:45.955 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121324.log
2026-03-25T15:45:45.955 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121302.log.gz
2026-03-25T15:45:45.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121344.log
2026-03-25T15:45:45.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121324.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121324.log.gz
2026-03-25T15:45:45.956 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121368.log
2026-03-25T15:45:45.956 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121344.log.gz
2026-03-25T15:45:45.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121393.log
2026-03-25T15:45:45.957 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121368.log.gz
2026-03-25T15:45:45.957 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121414.log
2026-03-25T15:45:45.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121393.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121393.log.gz
2026-03-25T15:45:45.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121436.log
2026-03-25T15:45:45.958 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121414.log.gz
2026-03-25T15:45:45.958 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121457.log
2026-03-25T15:45:45.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121436.log.gz
2026-03-25T15:45:45.959 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121479.log
2026-03-25T15:45:45.959 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121457.log.gz
2026-03-25T15:45:45.960 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121500.log
2026-03-25T15:45:45.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121479.log.gz
2026-03-25T15:45:45.960 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121522.log
2026-03-25T15:45:45.960 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121500.log.gz
2026-03-25T15:45:45.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121543.log
2026-03-25T15:45:45.961 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121522.log.gz
2026-03-25T15:45:45.961 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121565.log
2026-03-25T15:45:45.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121543.log.gz
2026-03-25T15:45:45.962 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121586.log
2026-03-25T15:45:45.962 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121565.log.gz
2026-03-25T15:45:45.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121608.log
2026-03-25T15:45:45.963 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121586.log.gz
2026-03-25T15:45:45.963 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121629.log
2026-03-25T15:45:45.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121608.log.gz
2026-03-25T15:45:45.964 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121651.log
2026-03-25T15:45:45.964 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121629.log.gz
2026-03-25T15:45:45.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121672.log
2026-03-25T15:45:45.965 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121651.log.gz
2026-03-25T15:45:45.965 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121694.log
2026-03-25T15:45:45.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121672.log.gz
2026-03-25T15:45:45.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121715.log
2026-03-25T15:45:45.966 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121694.log.gz
2026-03-25T15:45:45.966 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121737.log
2026-03-25T15:45:45.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121715.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121715.log.gz
2026-03-25T15:45:45.967 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121758.log
2026-03-25T15:45:45.967 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121737.log.gz
2026-03-25T15:45:45.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121780.log
2026-03-25T15:45:45.968 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121758.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121758.log.gz
2026-03-25T15:45:45.968 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121801.log
2026-03-25T15:45:45.969 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121780.log.gz
2026-03-25T15:45:45.969 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121823.log
2026-03-25T15:45:45.969 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121801.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.121801.log.gz
2026-03-25T15:45:45.969 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121844.log
2026-03-25T15:45:45.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121823.log.gz
2026-03-25T15:45:45.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121866.log
2026-03-25T15:45:45.970 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121844.log.gz
2026-03-25T15:45:45.970 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121887.log
2026-03-25T15:45:45.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121866.log.gz
2026-03-25T15:45:45.971 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121909.log
2026-03-25T15:45:45.971 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121887.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.121887.log.gz
2026-03-25T15:45:45.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121930.log
2026-03-25T15:45:45.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121909.log.gz
2026-03-25T15:45:45.972 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121952.log
2026-03-25T15:45:45.972 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121930.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121930.log.gz
2026-03-25T15:45:45.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121973.log
2026-03-25T15:45:45.973 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121952.log.gz
2026-03-25T15:45:45.973 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.121995.log
2026-03-25T15:45:45.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121973.log.gz
2026-03-25T15:45:45.974 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122016.log
2026-03-25T15:45:45.974 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.121995.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.121995.log.gz
2026-03-25T15:45:45.974 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122038.log
2026-03-25T15:45:45.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122016.log.gz
2026-03-25T15:45:45.975 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122059.log
2026-03-25T15:45:45.975 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122038.log.gz
2026-03-25T15:45:45.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122081.log
2026-03-25T15:45:45.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122059.log.gz
2026-03-25T15:45:45.976 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122102.log
2026-03-25T15:45:45.976 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122081.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122081.log.gz
2026-03-25T15:45:45.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122124.log
2026-03-25T15:45:45.977 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122102.log.gz
2026-03-25T15:45:45.977 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122145.log
2026-03-25T15:45:45.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122124.log.gz
2026-03-25T15:45:45.978 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122167.log
2026-03-25T15:45:45.978 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122145.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122145.log.gz
2026-03-25T15:45:45.978 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122188.log
2026-03-25T15:45:45.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122167.log.gz
2026-03-25T15:45:45.979 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122210.log
2026-03-25T15:45:45.979 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122188.log.gz
2026-03-25T15:45:45.980 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122231.log
2026-03-25T15:45:45.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122210.log.gz
2026-03-25T15:45:45.980 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122253.log
2026-03-25T15:45:45.980 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122231.log.gz
2026-03-25T15:45:45.981 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122274.log
2026-03-25T15:45:45.981 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122253.log.gz
2026-03-25T15:45:45.981 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122296.log
2026-03-25T15:45:45.982 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122274.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122274.log.gz
2026-03-25T15:45:45.982 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122317.log
2026-03-25T15:45:45.982 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122296.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122296.log.gz
2026-03-25T15:45:45.982 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122339.log
2026-03-25T15:45:45.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122317.log.gz
2026-03-25T15:45:45.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122360.log
2026-03-25T15:45:45.983 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122339.log.gz
2026-03-25T15:45:45.983 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122382.log
2026-03-25T15:45:45.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122360.log.gz
2026-03-25T15:45:45.984 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122403.log
2026-03-25T15:45:45.984 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122382.log.gz
2026-03-25T15:45:45.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122425.log
2026-03-25T15:45:45.985 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122403.log.gz
2026-03-25T15:45:45.985 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122446.log
2026-03-25T15:45:45.985 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122425.log.gz
2026-03-25T15:45:45.986 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122468.log
2026-03-25T15:45:45.986 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122446.log.gz
2026-03-25T15:45:45.986 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122489.log
2026-03-25T15:45:45.987 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122468.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122468.log.gz
2026-03-25T15:45:45.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122511.log
2026-03-25T15:45:45.987 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122489.log.gz
2026-03-25T15:45:45.987 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122532.log
2026-03-25T15:45:45.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122511.log.gz
2026-03-25T15:45:45.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122554.log
2026-03-25T15:45:45.988 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122532.log.gz
2026-03-25T15:45:45.988 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122575.log
2026-03-25T15:45:45.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122554.log.gz
2026-03-25T15:45:45.989 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122597.log
2026-03-25T15:45:45.989 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122575.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122575.log.gz
2026-03-25T15:45:45.990 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122618.log
2026-03-25T15:45:45.990 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122597.log.gz
2026-03-25T15:45:45.990 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122640.log
2026-03-25T15:45:45.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122618.log.gz
2026-03-25T15:45:45.991 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122661.log
2026-03-25T15:45:45.991 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122640.log.gz
2026-03-25T15:45:45.991 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122683.log
2026-03-25T15:45:45.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122661.log.gz
2026-03-25T15:45:45.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122704.log
2026-03-25T15:45:45.992 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122683.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122683.log.gz
2026-03-25T15:45:45.992 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122726.log
2026-03-25T15:45:45.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122704.log.gz
2026-03-25T15:45:45.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122747.log
2026-03-25T15:45:45.993 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122726.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122726.log.gz
2026-03-25T15:45:45.993 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122769.log
2026-03-25T15:45:45.994 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122747.log.gz
2026-03-25T15:45:45.994 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122790.log
2026-03-25T15:45:45.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122769.log.gz
2026-03-25T15:45:45.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122812.log
2026-03-25T15:45:45.995 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122790.log.gz
2026-03-25T15:45:45.995 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122833.log
2026-03-25T15:45:45.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122812.log.gz
2026-03-25T15:45:45.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122855.log
2026-03-25T15:45:45.996 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122833.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122833.log.gz
2026-03-25T15:45:45.996 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122876.log
2026-03-25T15:45:45.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122855.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122855.log.gz
2026-03-25T15:45:45.997 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122898.log
2026-03-25T15:45:45.997 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122876.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122876.log.gz
2026-03-25T15:45:45.998 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122919.log
2026-03-25T15:45:45.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122898.log.gz
2026-03-25T15:45:45.998 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122940.log
2026-03-25T15:45:45.998 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122919.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122919.log.gz
2026-03-25T15:45:45.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122966.log
2026-03-25T15:45:45.999 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122940.log.gz
2026-03-25T15:45:45.999 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.122983.log
2026-03-25T15:45:46.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122966.log.gz
2026-03-25T15:45:46.000 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123000.log
2026-03-25T15:45:46.000 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.122983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.122983.log.gz
2026-03-25T15:45:46.000 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123021.log
2026-03-25T15:45:46.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123000.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123000.log.gz
2026-03-25T15:45:46.001 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123041.log
2026-03-25T15:45:46.001 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123021.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123021.log.gz
2026-03-25T15:45:46.001 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123061.log
2026-03-25T15:45:46.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123041.log.gz
2026-03-25T15:45:46.002 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123082.log
2026-03-25T15:45:46.002 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123061.log.gz
2026-03-25T15:45:46.002 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123109.log
2026-03-25T15:45:46.003
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123082.log.gz 2026-03-25T15:45:46.003 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123133.log 2026-03-25T15:45:46.003 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123109.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123109.log.gz 2026-03-25T15:45:46.004 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123171.log 2026-03-25T15:45:46.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123133.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123133.log.gz 2026-03-25T15:45:46.004 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123195.log 2026-03-25T15:45:46.004 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123171.log.gz 2026-03-25T15:45:46.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123219.log 2026-03-25T15:45:46.005 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123195.log.gz 2026-03-25T15:45:46.005 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123241.log 2026-03-25T15:45:46.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123219.log.gz 2026-03-25T15:45:46.006 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123262.log 2026-03-25T15:45:46.006 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123241.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.123241.log.gz 2026-03-25T15:45:46.006 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123284.log 2026-03-25T15:45:46.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123262.log.gz 2026-03-25T15:45:46.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123304.log 2026-03-25T15:45:46.007 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123284.log.gz 2026-03-25T15:45:46.007 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123328.log 2026-03-25T15:45:46.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123304.log.gz 2026-03-25T15:45:46.008 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123477.log 2026-03-25T15:45:46.008 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123328.log.gz 2026-03-25T15:45:46.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123529.log 2026-03-25T15:45:46.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123477.log.gz 2026-03-25T15:45:46.009 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123585.log 2026-03-25T15:45:46.009 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123529.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123529.log.gz 2026-03-25T15:45:46.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.123635.log 2026-03-25T15:45:46.010 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123585.log.gz 2026-03-25T15:45:46.010 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123684.log 2026-03-25T15:45:46.011 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123635.log.gz 2026-03-25T15:45:46.011 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123733.log 2026-03-25T15:45:46.011 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123684.log.gz 2026-03-25T15:45:46.011 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123782.log 2026-03-25T15:45:46.012 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123733.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123733.log.gz 2026-03-25T15:45:46.012 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123831.log 2026-03-25T15:45:46.012 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123782.log.gz 2026-03-25T15:45:46.012 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123880.log 2026-03-25T15:45:46.013 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123831.log.gz 2026-03-25T15:45:46.013 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123929.log 2026-03-25T15:45:46.013 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123880.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123880.log.gz 2026-03-25T15:45:46.014 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.123978.log 2026-03-25T15:45:46.014 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123929.log.gz 2026-03-25T15:45:46.014 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.124028.log 2026-03-25T15:45:46.015 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.123978.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.123978.log.gz 2026-03-25T15:45:46.015 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.124075.log 2026-03-25T15:45:46.015 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.124028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.124028.log.gz 2026-03-25T15:45:46.016 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/ceph-client.admin.124075.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.124075.log.gz 2026-03-25T15:45:50.246 INFO:teuthology.orchestra.run.vm04.stderr: 93.8% -- replaced with /var/log/ceph/ceph-osd.2.log.gz 2026-03-25T15:45:51.950 INFO:teuthology.orchestra.run.vm04.stderr: 93.8% -- replaced with /var/log/ceph/ceph-osd.0.log.gz 2026-03-25T15:45:55.515 INFO:teuthology.orchestra.run.vm04.stderr: 94.1% -- replaced with /var/log/ceph/ceph-osd.1.log.gz 2026-03-25T15:45:55.516 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-25T15:45:55.517 INFO:teuthology.orchestra.run.vm04.stderr:real 0m12.090s 2026-03-25T15:45:55.517 INFO:teuthology.orchestra.run.vm04.stderr:user 0m27.576s 2026-03-25T15:45:55.517 INFO:teuthology.orchestra.run.vm04.stderr:sys 0m2.202s 2026-03-25T15:45:55.517 INFO:tasks.ceph:Archiving logs... 
2026-03-25T15:45:55.517 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/log/ceph to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645/remote/vm04/log
2026-03-25T15:45:55.517 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-25T15:45:57.076 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-25T15:45:57.078 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-25T15:45:57.078 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-25T15:45:57.118 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-25T15:45:57.118 DEBUG:teuthology.orchestra.run.vm04:>
2026-03-25T15:45:57.118 DEBUG:teuthology.orchestra.run.vm04:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-25T15:45:57.118 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y remove $d || true
2026-03-25T15:45:57.118 DEBUG:teuthology.orchestra.run.vm04:> done
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 103 M
2026-03-25T15:45:57.437 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 103 M
2026-03-25T15:45:57.438 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:45:57.442 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:45:57.442 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:45:57.461 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:45:57.462 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:45:57.506 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:45:57.530 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:45:57.530 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-25T15:45:57.530 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-25T15:45:57.530 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-25T15:45:57.530 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-25T15:45:57.530 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:57.533 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:45:57.569 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:45:57.585 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-25T15:45:58.289 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-25T15:45:58.290 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:45:58.347 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-25T15:45:58.348 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:58.348 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:45:58.348 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:20.2.0-712.g70f8415b.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-25T15:45:58.348 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:58.348 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:45:58.557 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 362 M
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:Remove 4 Packages
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 364 M
2026-03-25T15:45:58.558 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:45:58.561 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:45:58.561 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:45:58.590 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:45:58.590 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:45:58.658 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:45:58.665 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-test-2:20.2.0-712.g70f8415b.el9.x86_64 1/4
2026-03-25T15:45:58.669 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-25T15:45:58.673 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-25T15:45:58.688 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-25T15:45:58.754 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-25T15:45:58.754 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:20.2.0-712.g70f8415b.el9.x86_64 1/4
2026-03-25T15:45:58.755 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-25T15:45:58.755 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:20.2.0-712.g70f8415b.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:58.803 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:45:59.000 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 0
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 6.8 M
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 19 M
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: lua x86_64 5.4.4-4.el9 @appstream 593 k
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel x86_64 5.4.4-4.el9 @crb 49 k
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: luarocks noarch 3.9.2-5.el9 @epel 692 k
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: unzip x86_64 6.0-59.el9 @baseos 389 k
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout: zip x86_64 3.0-35.el9 @baseos 724 k
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:Remove 8 Packages
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 28 M
2026-03-25T15:45:59.001 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:45:59.004 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:45:59.004 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:45:59.030 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:45:59.031 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:45:59.073 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:45:59.078 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-2:20.2.0-712.g70f8415b.el9.x86_64 1/8
2026-03-25T15:45:59.081 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : luarocks-3.9.2-5.el9.noarch 2/8
2026-03-25T15:45:59.084 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lua-devel-5.4.4-4.el9.x86_64 3/8
2026-03-25T15:45:59.087 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : zip-3.0-35.el9.x86_64 4/8
2026-03-25T15:45:59.089 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : unzip-6.0-59.el9.x86_64 5/8
2026-03-25T15:45:59.091 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lua-5.4.4-4.el9.x86_64 6/8
2026-03-25T15:45:59.110 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 7/8
2026-03-25T15:45:59.110 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-25T15:45:59.110 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-25T15:45:59.110 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-25T15:45:59.110 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-25T15:45:59.110 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:59.111 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 7/8
2026-03-25T15:45:59.120 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 7/8
2026-03-25T15:45:59.140 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 8/8
2026-03-25T15:45:59.140 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-25T15:45:59.140 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-25T15:45:59.140 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-25T15:45:59.140 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-25T15:45:59.140 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:59.142 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 8/8
2026-03-25T15:45:59.237 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 8/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:20.2.0-712.g70f8415b.el9.x86_64 1/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64 2/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64 3/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-5.4.4-4.el9.x86_64 4/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 5/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 6/8
2026-03-25T15:45:59.238 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : unzip-6.0-59.el9.x86_64 7/8
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : zip-3.0-35.el9.x86_64 8/8
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout: lua-5.4.4-4.el9.x86_64
2026-03-25T15:45:59.409 INFO:teuthology.orchestra.run.vm04.stdout: lua-devel-5.4.4-4.el9.x86_64
2026-03-25T15:45:59.410 INFO:teuthology.orchestra.run.vm04.stdout: luarocks-3.9.2-5.el9.noarch
2026-03-25T15:45:59.410 INFO:teuthology.orchestra.run.vm04.stdout: unzip-6.0-59.el9.x86_64
2026-03-25T15:45:59.410 INFO:teuthology.orchestra.run.vm04.stdout: zip-3.0-35.el9.x86_64
2026-03-25T15:45:59.410 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:45:59.410 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:45:59.623 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout:===========================================================================================
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout:===========================================================================================
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 24 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 447 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 2.9 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 940 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 140 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 66 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 567 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 54 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 1.4 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 11 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp x86_64 20211102.0-4.el9 @epel 1.9 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 98 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 996 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 1.6 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 59 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 138 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup x86_64 2.8.1-3.el9 @baseos 770 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data noarch 1.46.7-10.el9 @epel 13 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 409 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 792 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-25T15:45:59.630 INFO:teuthology.orchestra.run.vm04.stdout: pciutils x86_64 3.7.0-7.el9 @baseos 216 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: protobuf x86_64 3.14.0-17.el9 @appstream 3.5 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler x86_64 3.14.0-17.el9 @crb 2.9 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 855 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio x86_64 1.46.7-10.el9 @epel 6.7 M
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 @epel 418 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging noarch 20.9-5.el9 @appstream 248 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf noarch 3.14.0-17.el9 @appstream 1.4 M 2026-03-25T15:45:59.631 
INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing noarch 2.4.7-9.el9 @baseos 635 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k 2026-03-25T15:45:59.631 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k 
2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: qatlib x86_64 25.08.0-2.el9 @appstream 639 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service x86_64 25.08.0-2.el9 @appstream 69 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs x86_64 1.3.1-1.el9 @appstream 148 k 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout:=========================================================================================== 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout:Remove 98 Packages 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 666 M 2026-03-25T15:45:59.632 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-25T15:45:59.659 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-25T15:45:59.659 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-25T15:45:59.789 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-25T15:45:59.789 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-25T15:45:59.955 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-25T15:45:59.955 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch 1/98 2026-03-25T15:45:59.963 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch 1/98 2026-03-25T15:45:59.985 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 2/98 2026-03-25T15:45:59.985 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:45:59.985 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-25T15:45:59.985 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 2026-03-25T15:45:59.985 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 
2026-03-25T15:45:59.985 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:45:59.986 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 2/98 2026-03-25T15:45:59.999 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 2/98 2026-03-25T15:46:00.053 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-modules-core-2:20.2.0-712.g70f8415b.el9.n 3/98 2026-03-25T15:46:00.054 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.noar 4/98 2026-03-25T15:46:00.113 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.noar 4/98 2026-03-25T15:46:00.121 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/98 2026-03-25T15:46:00.126 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/98 2026-03-25T15:46:00.126 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noarch 7/98 2026-03-25T15:46:00.139 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noarch 7/98 2026-03-25T15:46:00.146 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/98 2026-03-25T15:46:00.151 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/98 2026-03-25T15:46:00.160 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-grpcio-tools-1.46.7-10.el9.x86_64 10/98 2026-03-25T15:46:00.164 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-grpcio-1.46.7-10.el9.x86_64 11/98 2026-03-25T15:46:00.188 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 12/98 2026-03-25T15:46:00.188 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not 
supported for this. 2026-03-25T15:46:00.188 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-25T15:46:00.188 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-25T15:46:00.189 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-25T15:46:00.189 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.192 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 12/98 2026-03-25T15:46:00.201 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 12/98 2026-03-25T15:46:00.217 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 13/98 2026-03-25T15:46:00.217 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:46:00.217 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 
2026-03-25T15:46:00.217 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.225 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 13/98 2026-03-25T15:46:00.234 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 13/98 2026-03-25T15:46:00.237 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 14/98 2026-03-25T15:46:00.242 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 15/98 2026-03-25T15:46:00.247 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 16/98 2026-03-25T15:46:00.255 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 17/98 2026-03-25T15:46:00.260 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 18/98 2026-03-25T15:46:00.269 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 19/98 2026-03-25T15:46:00.276 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 20/98 2026-03-25T15:46:00.305 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 21/98 2026-03-25T15:46:00.314 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 22/98 2026-03-25T15:46:00.317 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 23/98 2026-03-25T15:46:00.326 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 24/98 2026-03-25T15:46:00.333 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 25/98 2026-03-25T15:46:00.333 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-diskprediction-local-2:20.2.0-712.g70f841 26/98 2026-03-25T15:46:00.341 INFO:teuthology.orchestra.run.vm04.stdout: Running 
scriptlet: ceph-mgr-diskprediction-local-2:20.2.0-712.g70f841 26/98 2026-03-25T15:46:00.437 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 27/98 2026-03-25T15:46:00.455 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 28/98 2026-03-25T15:46:00.470 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 29/98 2026-03-25T15:46:00.470 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-25T15:46:00.470 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.471 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 29/98 2026-03-25T15:46:00.500 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 29/98 2026-03-25T15:46:00.515 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 30/98 2026-03-25T15:46:00.521 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 31/98 2026-03-25T15:46:00.524 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : protobuf-compiler-3.14.0-17.el9.x86_64 32/98 2026-03-25T15:46:00.527 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 33/98 2026-03-25T15:46:00.549 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 34/98 2026-03-25T15:46:00.549 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:46:00.549 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-25T15:46:00.549 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 
2026-03-25T15:46:00.549 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-25T15:46:00.549 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.550 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 34/98 2026-03-25T15:46:00.559 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 34/98 2026-03-25T15:46:00.562 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 35/98 2026-03-25T15:46:00.565 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-packaging-20.9-5.el9.noarch 36/98 2026-03-25T15:46:00.568 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 37/98 2026-03-25T15:46:00.571 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 38/98 2026-03-25T15:46:00.575 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 39/98 2026-03-25T15:46:00.579 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 40/98 2026-03-25T15:46:00.583 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 41/98 2026-03-25T15:46:00.634 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 42/98 2026-03-25T15:46:00.648 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 43/98 2026-03-25T15:46:00.651 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 44/98 2026-03-25T15:46:00.654 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 45/98 2026-03-25T15:46:00.656 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 46/98 2026-03-25T15:46:00.660 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : 
libgfortran-11.5.0-14.el9.x86_64 47/98 2026-03-25T15:46:00.663 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 48/98 2026-03-25T15:46:00.685 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:20.2.0-712.g70f8415b 49/98 2026-03-25T15:46:00.685 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-25T15:46:00.685 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-25T15:46:00.685 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.686 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-immutable-object-cache-2:20.2.0-712.g70f8415b 49/98 2026-03-25T15:46:00.695 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:20.2.0-712.g70f8415b 49/98 2026-03-25T15:46:00.697 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 50/98 2026-03-25T15:46:00.699 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 51/98 2026-03-25T15:46:00.702 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ply-3.11-14.el9.noarch 52/98 2026-03-25T15:46:00.705 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 53/98 2026-03-25T15:46:00.707 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 54/98 2026-03-25T15:46:00.710 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 55/98 2026-03-25T15:46:00.713 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 56/98 2026-03-25T15:46:00.716 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 57/98 2026-03-25T15:46:00.719 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : 
python3-pyparsing-2.4.7-9.el9.noarch 58/98 2026-03-25T15:46:00.727 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 59/98 2026-03-25T15:46:00.731 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 60/98 2026-03-25T15:46:00.733 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 61/98 2026-03-25T15:46:00.736 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 62/98 2026-03-25T15:46:00.738 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 63/98 2026-03-25T15:46:00.743 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 64/98 2026-03-25T15:46:00.747 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 65/98 2026-03-25T15:46:00.752 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 66/98 2026-03-25T15:46:00.755 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 67/98 2026-03-25T15:46:00.757 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 68/98 2026-03-25T15:46:00.763 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : grpc-data-1.46.7-10.el9.noarch 69/98 2026-03-25T15:46:00.766 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-protobuf-3.14.0-17.el9.noarch 70/98 2026-03-25T15:46:00.770 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 71/98 2026-03-25T15:46:00.778 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 72/98 2026-03-25T15:46:00.782 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 73/98 2026-03-25T15:46:00.786 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 74/98 2026-03-25T15:46:00.788 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 75/98 2026-03-25T15:46:00.790 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-grafana-dashboards-2:20.2.0-712.g70f8415b.el9 76/98 2026-03-25T15:46:00.791 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-prometheus-alerts-2:20.2.0-712.g70f8415b.el9. 77/98 2026-03-25T15:46:00.811 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 78/98 2026-03-25T15:46:00.811 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph.target". 2026-03-25T15:46:00.812 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 2026-03-25T15:46:00.812 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.818 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 78/98 2026-03-25T15:46:00.818 INFO:teuthology.orchestra.run.vm04.stdout:warning: file /etc/logrotate.d/ceph: remove failed: No such file or directory 2026-03-25T15:46:00.818 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:00.843 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 78/98 2026-03-25T15:46:00.843 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 79/98 2026-03-25T15:46:00.965 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 79/98 2026-03-25T15:46:00.971 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qatzip-libs-1.3.1-1.el9.x86_64 80/98 2026-03-25T15:46:00.974 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-common-2:20.2.0-712.g70f8415b.el9.x86 81/98 2026-03-25T15:46:00.976 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 82/98 2026-03-25T15:46:00.976 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64 83/98 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64 83/98 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp 2026-03-25T15:46:06.712 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-25T15:46:06.723 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qatlib-25.08.0-2.el9.x86_64 84/98 2026-03-25T15:46:06.742 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 85/98 2026-03-25T15:46:06.742 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qatlib-service-25.08.0-2.el9.x86_64 85/98 2026-03-25T15:46:06.749 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 85/98 2026-03-25T15:46:06.752 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 86/98 2026-03-25T15:46:06.754 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 87/98 2026-03-25T15:46:06.756 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : pciutils-3.7.0-7.el9.x86_64 88/98 2026-03-25T15:46:06.759 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 89/98 2026-03-25T15:46:06.759 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : 
libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_64 90/98 2026-03-25T15:46:06.772 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_64 90/98 2026-03-25T15:46:06.774 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 91/98 2026-03-25T15:46:06.776 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 92/98 2026-03-25T15:46:06.778 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : protobuf-3.14.0-17.el9.x86_64 93/98 2026-03-25T15:46:06.782 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 94/98 2026-03-25T15:46:06.787 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 95/98 2026-03-25T15:46:06.795 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : cryptsetup-2.8.1-3.el9.x86_64 96/98 2026-03-25T15:46:06.799 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : abseil-cpp-20211102.0-4.el9.x86_64 97/98 2026-03-25T15:46:06.799 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64 98/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64 98/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 1/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64 2/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64 3/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:20.2.0-712.g70f8415b.el9 4/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:20.2.0-712.g70f8415b 5/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: 
Verifying : ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64 6/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noarch 7/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.noar 8/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:20.2.0-712.g70f841 9/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:20.2.0-712.g70f8415b.el9.n 10/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch 11/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64 12/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-prometheus-alerts-2:20.2.0-712.g70f8415b.el9. 13/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64 14/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch 15/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 16/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 17/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 18/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 19/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 20/98 2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 21/98 
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 22/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64 23/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 24/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 25/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 26/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 27/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_64 28/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 29/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 30/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 31/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 32/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 33/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 34/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 35/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 36/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 37/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 38/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 39/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 40/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 41/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:20.2.0-712.g70f8415b.el9.x86 42/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 43/98
2026-03-25T15:46:06.887 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 45/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 46/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 47/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 48/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 49/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 50/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 51/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 52/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 53/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 54/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 55/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 56/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 57/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 58/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 59/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 60/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 61/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 62/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 63/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 64/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 66/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 67/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 68/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 69/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 70/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 71/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 72/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 73/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 74/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 75/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 76/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 77/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 78/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 79/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 80/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 81/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 82/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 83/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 84/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 85/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 86/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 87/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 88/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 89/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 90/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 91/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 92/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 93/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 94/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 95/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 96/98
2026-03-25T15:46:06.888 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 97/98
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64 98/98
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: abseil-cpp-20211102.0-4.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ceph-volume-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: cryptsetup-2.8.1-3.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: grpc-data-1.46.7-10.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: pciutils-3.7.0-7.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-3.14.0-17.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: protobuf-compiler-3.14.0-17.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-25T15:46:06.969 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-1.46.7-10.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna-2.10-7.el9.1.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch-1.21-16.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer-2.0-4.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib-3.1.1-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-packaging-20.9-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable-0.7.2-27.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-protobuf-3.14.0-17.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyparsing-2.4.7-9.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-25T15:46:06.970 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-25.08.0-2.el9.x86_64
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout: qatlib-service-25.08.0-2.el9.x86_64
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout: qatzip-libs-1.3.1-1.el9.x86_64
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:06.971 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:20.2.0-712.g70f8415b.el9 @ceph-noarch 1.0 M
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 1.0 M
2026-03-25T15:46:07.168 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:46:07.170 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:46:07.170 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:46:07.171 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:46:07.171 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:46:07.187 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:46:07.187 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : cephadm-2:20.2.0-712.g70f8415b.el9.noarch 1/1
2026-03-25T15:46:07.290 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:20.2.0-712.g70f8415b.el9.noarch 1/1
2026-03-25T15:46:07.328 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:20.2.0-712.g70f8415b.el9.noarch 1/1
2026-03-25T15:46:07.328 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:07.328 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:46:07.328 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:20.2.0-712.g70f8415b.el9.noarch
2026-03-25T15:46:07.328 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:07.328 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:07.503 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-immutable-object-cache
2026-03-25T15:46:07.503 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:07.506 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:07.506 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:07.506 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:07.674 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr
2026-03-25T15:46:07.674 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:07.677 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:07.678 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:07.678 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:07.850 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-dashboard
2026-03-25T15:46:07.850 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:07.854 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:07.854 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:07.854 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:08.024 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-25T15:46:08.024 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:08.027 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:08.028 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:08.028 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:08.190 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-rook
2026-03-25T15:46:08.190 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:08.193 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:08.194 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:08.194 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:08.361 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-cephadm
2026-03-25T15:46:08.361 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:08.364 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:08.365 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:08.365 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 2.7 M
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout: fuse x86_64 2.9.9-17.el9 @baseos 214 k
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 2.9 M
2026-03-25T15:46:08.543 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:46:08.546 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:46:08.546 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:46:08.560 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:46:08.560 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:46:08.590 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:46:08.594 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:46:08.607 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : fuse-2.9.9-17.el9.x86_64 2/2
2026-03-25T15:46:08.676 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: fuse-2.9.9-17.el9.x86_64 2/2
2026-03-25T15:46:08.676 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:46:08.716 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : fuse-2.9.9-17.el9.x86_64 2/2
2026-03-25T15:46:08.716 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:08.716 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:46:08.716 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 fuse-2.9.9-17.el9.x86_64
2026-03-25T15:46:08.716 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:08.716 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:08.888 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-volume
2026-03-25T15:46:08.888 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:08.891 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:08.892 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:08.892 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:09.070 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repo Size
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 449 k
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 155 k
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 604 k
2026-03-25T15:46:09.071 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:46:09.073 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:46:09.073 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:46:09.083 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:46:09.083 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:46:09.109 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:46:09.111 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs-devel-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:46:09.124 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64 2/2
2026-03-25T15:46:09.181 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64 2/2
2026-03-25T15:46:09.182 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:20.2.0-712.g70f8415b.el9.x86_64 1/2
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64 2/2
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.220 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repo Size
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 2.4 M
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 510 k
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-proxy2 x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 52 k
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 187 k
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Remove 4 Packages
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 3.2 M
2026-03-25T15:46:09.414 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:46:09.416 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:46:09.416 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:46:09.428 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:46:09.428 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:46:09.456 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:46:09.459 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cephfs-2:20.2.0-712.g70f8415b.el9.x86_64 1/4
2026-03-25T15:46:09.460 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-argparse-2:20.2.0-712.g70f8415b.el9.x86 2/4
2026-03-25T15:46:09.461 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64 3/4
2026-03-25T15:46:09.475 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64 3/4
2026-03-25T15:46:09.475 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_64 4/4
2026-03-25T15:46:09.537 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_64 4/4
2026-03-25T15:46:09.537 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_64 1/4
2026-03-25T15:46:09.537 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64 2/4
2026-03-25T15:46:09.537 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:20.2.0-712.g70f8415b.el9.x86 3/4
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:20.2.0-712.g70f8415b.el9.x86_64 4/4
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-proxy2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.581 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:09.758 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: libcephfs-devel
2026-03-25T15:46:09.758 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:09.761 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:09.762 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:09.762 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:09.948 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 12 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 1.1 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 1.1 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 264 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-16.el9 @appstream 37 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 238 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 498 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: libnbd x86_64 1.20.3-4.el9 @appstream 453 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 10 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:20.2.0-712.g70f8415b.el9 @ceph 28 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-25T15:46:09.950 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.951 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-25T15:46:09.951 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-25T15:46:09.951 INFO:teuthology.orchestra.run.vm04.stdout:Remove 20 Packages
2026-03-25T15:46:09.951 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:09.951 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 84 M
2026-03-25T15:46:09.951 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-25T15:46:09.954 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-25T15:46:09.954 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-25T15:46:09.978 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-25T15:46:09.978 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-25T15:46:10.022 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-25T15:46:10.026 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-nbd-2:20.2.0-712.g70f8415b.el9.x86_64 1/20
2026-03-25T15:46:10.028 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 2/20
2026-03-25T15:46:10.030 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rgw-2:20.2.0-712.g70f8415b.el9.x86_64 3/20
2026-03-25T15:46:10.030 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librgw2-2:20.2.0-712.g70f8415b.el9.x86_64 4/20
2026-03-25T15:46:10.045 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:20.2.0-712.g70f8415b.el9.x86_64 4/20
2026-03-25T15:46:10.048 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/20
2026-03-25T15:46:10.050 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rbd-2:20.2.0-712.g70f8415b.el9.x86_64 6/20
2026-03-25T15:46:10.052 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rados-2:20.2.0-712.g70f8415b.el9.x86_64 7/20
2026-03-25T15:46:10.053 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-16.el9.x86_64 8/20
2026-03-25T15:46:10.056 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/20
2026-03-25T15:46:10.056 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librbd1-2:20.2.0-712.g70f8415b.el9.x86_64 10/20
2026-03-25T15:46:10.070 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:20.2.0-712.g70f8415b.el9.x86_64 10/20
2026-03-25T15:46:10.070 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados2-2:20.2.0-712.g70f8415b.el9.x86_64 11/20
2026-03-25T15:46:10.070 INFO:teuthology.orchestra.run.vm04.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-25T15:46:10.070 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:10.084 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:20.2.0-712.g70f8415b.el9.x86_64 11/20
2026-03-25T15:46:10.086 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/20
2026-03-25T15:46:10.089 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/20
2026-03-25T15:46:10.093 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/20
2026-03-25T15:46:10.095 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/20
2026-03-25T15:46:10.098 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libnbd-1.20.3-4.el9.x86_64 16/20
2026-03-25T15:46:10.100 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 17/20
2026-03-25T15:46:10.103 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 18/20
2026-03-25T15:46:10.105 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 19/20
2026-03-25T15:46:10.120 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-25T15:46:10.189 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-25T15:46:10.189 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 4/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:20.2.0-712.g70f8415b.el9.x86_64 7/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:20.2.0-712.g70f8415b.el9.x86_64 8/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:20.2.0-712.g70f8415b.el9.x86_64 10/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 11/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 12/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:20.2.0-712.g70f8415b.el9.x86_64 13/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:20.2.0-712.g70f8415b.el9.x86_64 14/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:20.2.0-712.g70f8415b.el9.x86_64 15/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-16.el9.x86_64 16/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:20.2.0-712.g70f8415b.el9.x86_64 17/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:20.2.0-712.g70f8415b.el9.x86_64 18/20
2026-03-25T15:46:10.190 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 19/20
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 20/20
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: libnbd-1.20.3-4.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: librbd1-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd-17:10.1.0-16.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:20.2.0-712.g70f8415b.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-25T15:46:10.241 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:10.416 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: librbd1
2026-03-25T15:46:10.417 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:10.420 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:10.421 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:10.421 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:10.610 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rados
2026-03-25T15:46:10.610 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:10.613 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:10.613 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:10.614 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:10.780 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rgw
2026-03-25T15:46:10.780 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:10.783 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:10.784 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:10.784 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:10.945 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-cephfs
2026-03-25T15:46:10.945 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:10.949 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:10.949 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:10.949 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:11.115 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rbd
2026-03-25T15:46:11.115 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:11.119 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:11.119 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:11.119 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:11.292 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-fuse
2026-03-25T15:46:11.292 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:11.295 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:11.296 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:11.296 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:11.474 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-mirror
2026-03-25T15:46:11.474 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:11.479 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:11.479 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:11.480 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:11.655 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-nbd
2026-03-25T15:46:11.655 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-25T15:46:11.658 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-25T15:46:11.659 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-25T15:46:11.659 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-25T15:46:11.680 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-25T15:46:11.805 INFO:teuthology.orchestra.run.vm04.stdout:56 files removed
2026-03-25T15:46:11.823 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-25T15:46:11.849 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean expire-cache
2026-03-25T15:46:12.001 INFO:teuthology.orchestra.run.vm04.stdout:Cache was expired
2026-03-25T15:46:12.001 INFO:teuthology.orchestra.run.vm04.stdout:0 files removed
2026-03-25T15:46:12.020 DEBUG:teuthology.parallel:result is None
2026-03-25T15:46:12.020 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm04.local
2026-03-25T15:46:12.021 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-25T15:46:12.045 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-25T15:46:12.114 DEBUG:teuthology.parallel:result is None
2026-03-25T15:46:12.114 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-25T15:46:12.116 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-25T15:46:12.116 DEBUG:teuthology.orchestra.run.vm04:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-25T15:46:12.169 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-25T15:46:12.583 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-25T15:46:12.584 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-25T15:46:12.584 INFO:teuthology.orchestra.run.vm04.stdout:^- 185.13.148.71 2 7 377 90 -180us[ -180us] +/- 18ms
2026-03-25T15:46:12.584 INFO:teuthology.orchestra.run.vm04.stdout:^+ time.cloudflare.com 3 7 377 92 -1034us[-1034us] +/- 15ms
2026-03-25T15:46:12.584 INFO:teuthology.orchestra.run.vm04.stdout:^- 130.162.237.177 1 7 377 91 -11ms[ -11ms] +/- 32ms
2026-03-25T15:46:12.584 INFO:teuthology.orchestra.run.vm04.stdout:^* ec2-18-192-244-117.eu-ce> 2 7 377 93 +1142us[+1145us] +/- 17ms
2026-03-25T15:46:12.584 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-25T15:46:12.586 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-25T15:46:12.587 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-25T15:46:12.589 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-25T15:46:12.591 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-25T15:46:12.593 INFO:teuthology.task.internal:Duration was 1296.236876 seconds
2026-03-25T15:46:12.593 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-25T15:46:12.595 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-25T15:46:12.595 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-25T15:46:12.665 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-25T15:46:13.150 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-25T15:46:13.150 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm04.local
2026-03-25T15:46:13.150 DEBUG:teuthology.orchestra.run.vm04:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-25T15:46:13.211 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-25T15:46:13.225 DEBUG:teuthology.orchestra.run.vm04:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-25T15:46:13.683 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-25T15:46:13.683 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-25T15:46:13.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-25T15:46:13.706 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-25T15:46:13.707 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose --/home/ubuntu/cephtest/archive/syslog/kern.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-25T15:46:13.707 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-25T15:46:13.707 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-25T15:46:13.865 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 98.5% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-25T15:46:13.867 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-25T15:46:13.970 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-25T15:46:13.981 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-25T15:46:14.006 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-25T15:46:14.066 DEBUG:teuthology.orchestra.run.vm04:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-25T15:46:14.090 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = core
2026-03-25T15:46:14.105 DEBUG:teuthology.orchestra.run.vm04:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-25T15:46:14.162 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-25T15:46:14.162 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-25T15:46:14.164 INFO:teuthology.task.internal:Transferring archived files...
2026-03-25T15:46:14.164 DEBUG:teuthology.misc:Transferring archived files from vm04:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3645/remote/vm04
2026-03-25T15:46:14.164 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-25T15:46:14.233 INFO:teuthology.task.internal:Removing archive directory...
2026-03-25T15:46:14.233 DEBUG:teuthology.orchestra.run.vm04:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-25T15:46:14.289 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-25T15:46:14.292 INFO:teuthology.task.internal:Not uploading archives.
2026-03-25T15:46:14.292 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-25T15:46:14.294 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-25T15:46:14.294 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-25T15:46:14.347 INFO:teuthology.orchestra.run.vm04.stdout: 8532146 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 25 15:46 /home/ubuntu/cephtest
2026-03-25T15:46:14.348 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-25T15:46:14.353 INFO:teuthology.run:Summary data:
description: rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-hybrid supported-random-distro$/{centos_latest} workloads/rbd_cli_generic}
duration: 1296.2368757724762
flavor: default
owner: kyr
success: true
2026-03-25T15:46:14.353 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-25T15:46:14.377 INFO:teuthology.run:pass